Gallery:
- Edgelord Samurai Logo
- Anime Story Intro
- Thrown Bamboo Cutting (Tameshigiri)
- Basic Swinging Tutorial
- New Thrown Party Mode
- In-air Computer Vision Tracking
- Successful Cut of a Thrown Tracked Object
- Edgelord Inspirational Sketch
- Edgelord Samurai Orange
- Katana Controller
- Teaching Samurai Swordsmanship at the SensAI Hackathon
- Learning Proper Sword Stances (Kamae)
Inspiration:
Edgelord Samurai began with the original Katana Samurai demo created by Richard Borys, a simple single-player bamboo-slicing experience in virtual reality with an immersive storyline. It follows a young son, grieving the loss of his father, who was killed in the war, as the kingdom recruits him for samurai training to honor his father’s legacy and prepare for the growing conflict. In that experience, landing a clean cut felt great, but it was lonely, with no shared energy and no spectators. For the SensAI hackathon, our team asked how we could take that feeling of studying the blade and turn it into a mixed reality dojo party in a living room.
We looked to the discipline of Japanese sword training, where posture, breath, and precise cuts matter, and combined it with the instant fun of games like Fruit Ninja and the social chaos of couch party games. The result is a project that blends serious dojo focus with friends laughing, throwing a giant orange ball, and calling out for one more round.
What it does:
Edgelord Samurai turns any room into a shared mixed reality dojo on Meta Quest. The player wears the headset and holds a katana-style controller. A friend throws a giant orange foam ball through the play space. Our system uses OpenCV color tracking on the passthrough video to lock onto the ball and follow its movement in real time.
In the headset, the ball becomes the anchor point for virtual fruit, targets, and impact effects that align with the real world motion. When the player swings the katana through that space, they see satisfying slices, particles, and scoring feedback that feel tied directly to the physical throw. Multiple Meta Quest headsets can join the same session so other players and spectators see the same dojo, the same ball path, and the same hits from different viewpoints.
How we built it:
We built Edgelord Samurai in Unity on Meta Quest using passthrough mixed reality and OpenCV. The headset shows the real room, and we layer a stylized dojo environment over it. We use a bright orange foam ball as a physical training orb and feed the passthrough frames into OpenCV.
By defining minimum and maximum values for the orange color, OpenCV isolates the ball from the rest of the scene and calculates its position in each frame. Unity receives that position and attaches virtual targets, trails, and hit effects to it so the physical ball and the digital effects feel like one object. The katana controller is treated as a training sword: the game checks when its swing path intersects the tracked ball and triggers slices, sound effects, and scoring when the timing and placement are correct. On top of this, the player can choose between a single-player dojo mode and a party mode with a thrower and a cutter.
Challenges we ran into:
We discovered that tracking a real object in mixed reality is very sensitive to the environment. Different lighting, wall colors, and clothing made it harder for OpenCV to cleanly isolate the orange ball. We spent time tuning the color ranges, smoothing detection, and reducing jitter so the ball would stay stable on screen.
We also had to balance responsiveness and reliability. If tracking reacted too quickly, it produced noisy movement; if the filtering was too strict, the ball would occasionally pop in and out of view. Finally, we had practical challenges around safety and hardware, such as designing a katana prop that felt good to swing while keeping enough distance between the thrower, the player, and the surrounding furniture.
Accomplishments that we are proud of:
We are proud that this is a true upgrade to an existing project. The original Katana Samurai was a single-player bamboo slicer. Edgelord Samurai turns that foundation into a social mixed reality dojo with physical props, shared presence, and multiple modes. It moves from a solo toy toward a party game where people take turns and cheer for each other.
We are also proud of how much we achieved with relatively simple technology. Careful OpenCV color tracking and good game feel made a foam ball feel like a magical training orb that bridges the real and digital worlds. Seeing multiple people in Meta Quest headsets react to the same throw, shout when someone lands a clean cut, and immediately want another turn is the best proof that the concept works.
What we learned:
We learned that mixed reality sits at the intersection of computer vision, game design, and physical space. Simple tools like color tracking can create powerful illusions when combined with thoughtful feedback and real objects. At the same time, MR forced us to pay close attention to safety, comfort, and clarity because players are moving their bodies in real rooms, not just using thumb sticks.
We also learned that social presence multiplies engagement. The moment we added spectators and turn taking, the project felt more alive. Finally, we learned how important it is to design within real constraints like noisy environments and limited hackathon time, and to prioritize features that work reliably in front of judges and players.
What is next for 700 SF13 Edgelord Samurai:
Next, we plan to fast-follow into the Meta Horizon Start Developer Competition next week with an improved build of Edgelord Samurai. Our focus will be on strengthening tracking across more environments, polishing the visual presentation of the dojo, tightening the onboarding so new players can start quickly, and refining the core loop so every throw and every cut feels intentional, readable, and fun in a mixed reality setting.