
Inspiration

Third Seat was originally created to explore how VR can support emotional connection, reflection, and communication for couples. I wanted to design something that feels personal - an experience where you can talk openly, explore ideas safely, and interact with tools that help regulate your mind and body.

For this competition, I focused on pushing the app toward a fully hands-first interaction model. I’ve always loved when VR feels natural, intuitive, and gesture-based, so my goal was to replace UI clicks and controller inputs with meaningful micro-gestures, somatic interactions, and fluid hand-tracked tools.

What I Built

This version introduces several major upgrades centered on advanced hand interactions, natural gesture input, and AI-driven communication tools.

New Hand Interaction Systems

Gesture-based teleportation locomotion: A directional wrist gesture that activates teleportation (shown in the demo video). Implemented and currently being refined as part of the subsystem integration work.

Gesture-based menus (“Reframe UI”): A palm-up posture opens the Language Selection menu, and a “framed hands” gesture recenters the UI in front of the player.

Thumb-to-pinky “Speak to AI” gesture: A natural hand signal that opens the microphone and routes speech into my Azure STT → GPT → TTS pipeline.
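To give a concrete sense of how a micro-gesture like this can be recognized, below is a minimal sketch of a thumb-to-pinky detector built on Unity's Hand Tracking Subsystem (the `UnityEngine.XR.Hands` package). The distance thresholds and the `OnSpeakGesture` event are illustrative assumptions, not the exact values or wiring used in Third Seat.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch: detects a thumb-tip-to-little-tip "pinch" on the right hand
// and fires an event once per gesture. Thresholds and event wiring are
// illustrative assumptions, not the project's actual values.
public class SpeakGestureDetector : MonoBehaviour
{
    [SerializeField] float triggerDistance = 0.025f; // metres; tune per user
    [SerializeField] float releaseDistance = 0.045f; // hysteresis to avoid flicker

    XRHandSubsystem hands;
    bool gestureActive;

    public event System.Action OnSpeakGesture; // hook this to the mic/STT entry point

    void Update()
    {
        if (hands == null || !hands.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            hands = subsystems.Count > 0 ? subsystems[0] : null;
            if (hands == null) return;
        }

        XRHand hand = hands.rightHand;
        if (!hand.isTracked) return;

        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumb) &&
            hand.GetJoint(XRHandJointID.LittleTip).TryGetPose(out Pose little))
        {
            float distance = Vector3.Distance(thumb.position, little.position);

            if (!gestureActive && distance < triggerDistance)
            {
                gestureActive = true;
                OnSpeakGesture?.Invoke(); // e.g. open the microphone
            }
            else if (gestureActive && distance > releaseDistance)
            {
                gestureActive = false;
            }
        }
    }
}
```

The separate trigger and release distances act as hysteresis so the gesture doesn't flicker on noisy tracking frames.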

Somatic Regulation Tools

Hand-tracked stress ball and pillow: Players interact through squeezing, stretching, and soft manipulation that encourages self-regulation. Includes dynamic haptics for controllers and squeezable physics for hands.
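As a rough sketch of how a squeezable tool can drive controller haptics, the snippet below maps a normalized squeeze amount to a short haptic impulse via `UnityEngine.XR.InputDevice.SendHapticImpulse`. The component name, how `squeezeAmount` is derived, and the amplitude mapping are assumptions for illustration only.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: maps a normalized squeeze amount (0..1) on a soft object
// to a short haptic impulse on the controller holding it. How squeezeAmount is
// computed (grip value, hand-joint spread, mesh deformation) is up to the object.
public class SqueezeHaptics : MonoBehaviour
{
    [SerializeField] XRNode holdingHand = XRNode.RightHand;
    [SerializeField, Range(0f, 1f)] float maxAmplitude = 0.6f;
    [SerializeField] float pulseDuration = 0.05f; // seconds per pulse

    public void OnSqueezeChanged(float squeezeAmount) // call from the grab/deform logic
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(holdingHand);
        if (device.isValid &&
            device.TryGetHapticCapabilities(out HapticCapabilities caps) &&
            caps.supportsImpulse)
        {
            float amplitude = Mathf.Clamp01(squeezeAmount) * maxAmplitude;
            device.SendHapticImpulse(0u, amplitude, pulseDuration);
        }
    }
}
```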

What’s Working Today

The build submitted includes:

  • Full AI conversation loop in multiplayer

  • All somatic tools with hand tracking

  • Hand-tracked grabbing and UI interaction

What’s Implemented but Still Experimental

These features are implemented and demonstrated in the video, but they are still being refined inside the OpenXR hand-tracking subsystem. They can be seen working in the video because they ran correctly in-editor; I could not produce a device build with them enabled by the deadline.

  • Thumb-to-pinky “Speak to Mira” gesture

  • Hand-tracked locomotion: thumbs up on either hand to rotate, both thumbs up to nudge backwards

  • Gesture-based teleportation with the right palm up and a thumb-to-index pinch (sketched after this list)

  • Gesture-based Language Selection menu

  • Gesture-triggered “Reframe” UI

These systems are real, functional features - just temporarily impacted by manifest merging and subsystem initialization conflicts. I chose to show them in the demo video because they represent the direction and ambition of the update.
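For reference, here is a rough sketch of the palm-up plus thumb-to-index teleport trigger mentioned above, expressed against the same XR Hands joint API. The thresholds and the palm axis convention are assumptions and would need tuning on device.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch of the "right palm up + thumb-to-index pinch" teleport trigger.
// Assumes an XRHand for the right hand is supplied each frame by the gesture
// manager; thresholds and the palm axis convention are illustrative.
public static class TeleportGesture
{
    const float PinchDistance = 0.02f; // metres between thumb and index tips
    const float PalmUpDot = 0.6f;      // how strictly "up" the palm must face

    public static bool IsTriggered(XRHand rightHand)
    {
        if (!rightHand.isTracked) return false;

        if (!rightHand.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose palm) ||
            !rightHand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumb) ||
            !rightHand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out Pose index))
            return false;

        // Assumption: the palm joint's up axis points out of the back of the hand,
        // so a palm facing the sky has that axis pointing downward. Flip the sign
        // if your runtime uses a different convention.
        bool palmFacingUp = Vector3.Dot(-palm.up, Vector3.up) > PalmUpDot;

        bool pinching = Vector3.Distance(thumb.position, index.position) < PinchDistance;

        return palmFacingUp && pinching;
    }
}
```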

How I built it

The project is built in Unity using:

  • OpenXR with Unity's Hand Tracking Subsystem

  • Normcore for multiplayer and synced avatars

  • Azure Speech Services for STT/TTS

  • OpenAI GPT models for AI-driven therapist dialogue

  • Custom gesture recognizers for micro-gestures and palm-rotation states

  • Custom UX systems for somatic tools, haptic feedback, social cues, and synchronized session states

One of the biggest challenges was aligning all these systems - hand tracking, networking, speech, and AI - into a cohesive, responsive simulation. Integrating the Hand Tracking Subsystem required deep debugging of manifests, XR initialization, and input routing, and the competition gave me the push to integrate more gestures and hands-first locomotion.
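To make the conversation loop concrete, here is a heavily simplified outline of one Azure STT → GPT → TTS round trip using the Azure Speech SDK. The keys, region, voice name, and the `AskGptAsync` helper are placeholders (the helper stands in for a POST to OpenAI's chat-completions endpoint), so this is an assumption-laden sketch rather than the project's actual pipeline code.

```csharp
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech;

// Simplified outline of the speech round trip: microphone -> Azure STT ->
// GPT -> Azure TTS. Keys, region, voice name and the AskGptAsync helper are
// placeholders; the in-project pipeline adds streaming, error handling and
// multiplayer routing on top of this.
public class ConversationLoop
{
    readonly SpeechConfig speechConfig;

    public ConversationLoop(string azureKey, string azureRegion)
    {
        speechConfig = SpeechConfig.FromSubscription(azureKey, azureRegion);
        speechConfig.SpeechRecognitionLanguage = "en-US";
        speechConfig.SpeechSynthesisVoiceName = "en-US-JennyNeural"; // placeholder voice
    }

    // Called when the "Speak to AI" gesture fires.
    public async Task RunOnceAsync()
    {
        // 1. Speech-to-text from the default microphone.
        using var recognizer = new SpeechRecognizer(speechConfig);
        var sttResult = await recognizer.RecognizeOnceAsync();
        if (sttResult.Reason != ResultReason.RecognizedSpeech) return;

        // 2. Send the transcript to the language model.
        string reply = await AskGptAsync(sttResult.Text);

        // 3. Speak the reply back through Azure TTS.
        using var synthesizer = new SpeechSynthesizer(speechConfig);
        await synthesizer.SpeakTextAsync(reply);
    }

    // Hypothetical helper: wraps a POST to the OpenAI chat-completions endpoint
    // (https://api.openai.com/v1/chat/completions) and returns the reply text.
    Task<string> AskGptAsync(string userText)
    {
        // Implementation omitted in this sketch.
        return Task.FromResult("(model reply)");
    }
}
```

In the project this loop also has to run inside a multiplayer session synced over Normcore, which is part of what made aligning the systems challenging.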

Challenges I ran into

The biggest challenge was integrating multiple layers of hand tracking (Meta Aim + OpenXR subsystem) with custom gestures while also managing a custom Android manifest for STT, TTS, OpenXR, and hand tracking permissions. This led to conflicts, black-screen boots, and subsystem initialization failures that required deep debugging.

But this process pushed me to better understand Horizon OS, manifest merging, and low-level OpenXR boot flows - something I’ll continue refining after the competition.

Accomplishments that I'm proud of

Despite the tight timeline and the complexity of integrating multiple XR systems, I achieved several major milestones that significantly push my project forward:

Hand-first interaction overhaul - I successfully implemented natural, intuitive hand interactions across the entire experience, including direct UI manipulation, gesture-based confirmation inputs, and grab-based interactions for somatic regulation tools. This shifted the project toward the hands-first direction Meta is encouraging.

Somatic tools reimagined for hand tracking - The stress ball, pillow, and grounding objects now fully support hand tracking, complete with stretch/squeeze logic, positional feedback, and tactile audio cues that help players regulate emotional load in-experience.

Gesture-based navigation system - I built a full suite of gesture-driven locomotion and UI features, including pinch-to-teleport, palm-up menus, two-thumb “nudge back,” and palm-facing reframing gestures. These interactions all worked consistently in-editor and demonstrated the potential for completely controller-free exploration.
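As one example of how this gesture suite can be recognized, here is a rough thumbs-up heuristic over the same joint data, which could back the rotation and two-thumb "nudge back" inputs. The joints checked and the thresholds are assumptions; a more robust recognizer would evaluate curl on all four fingers.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Rough heuristic for a "thumbs up" pose: thumb tip well above the palm while
// the index finger is curled toward the palm. Thresholds are illustrative.
public static class ThumbsUpGesture
{
    const float ThumbHeight = 0.05f;  // metres the thumb tip must sit above the palm
    const float CurlDistance = 0.06f; // max index-tip-to-palm distance when curled

    public static bool IsThumbsUp(XRHand hand)
    {
        if (!hand.isTracked) return false;

        if (!hand.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose palm) ||
            !hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumb) ||
            !hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out Pose index))
            return false;

        bool thumbRaised = thumb.position.y - palm.position.y > ThumbHeight;
        bool indexCurled = Vector3.Distance(index.position, palm.position) < CurlDistance;
        return thumbRaised && indexCurled;
    }
}
```

Usage would follow the description above: rotate when one hand holds the pose, nudge backwards when both hands do.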

Resilient problem-solving under pressure - When the hand-tracking subsystem introduced unexpected build-level issues, I created a fallback build to ensure the project remains fully playable while continuing to prototype and refine the advanced gesture system. I also documented and demonstrated the working gestures within the submission video so judges can see the intended feature set.

These accomplishments reflect significant progress toward a fully hand-first VR experience and show the foundation for an even more robust build in the near future.

What I learned

  • OpenXR hand tracking pipelines

  • Gesture recognition patterns

  • Multipass manifest merging in Unity

  • Designing intuitive micro-gestures

  • Building somatic interaction tools for hands

  • Horizon OS debugging with logcat & symbolic traces

Even with challenges, this update significantly transformed Third Seat and brought me closer to a fully hands-first, emotionally supportive VR experience.

What's next for Third Seat

The next phase of development is focused on bringing all tested and functional gesture systems from the Unity editor into a fully integrated, stable build for Meta Quest.

Several major features such as gesture-based teleportation, gesture-activated menus, and gesture snap rotation are already implemented and working reliably in-editor. The immediate goal is to finish reconciling Unity’s OpenXR hand-tracking subsystem with my custom gesture pipeline and Android manifest, ensuring these interactions initialize correctly on-device.

Once the subsystem and manifest integration are complete, I will:

  • Finalize the fully hands-first locomotion system (gesture-based teleportation, palm locomotion, and rotation gestures)

  • Tune gesture recognition thresholds for reliability across users (a small configuration sketch follows below)
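For the threshold-tuning pass, one lightweight approach (an assumption about how it could be structured, not existing project code) is to centralize the tunable values in a ScriptableObject so per-user calibration does not require code changes:

```csharp
using UnityEngine;

// Assumed future structure: a single asset holding the tunable gesture
// thresholds so per-user calibration doesn't require code changes.
[CreateAssetMenu(menuName = "ThirdSeat/Gesture Thresholds")]
public class GestureThresholds : ScriptableObject
{
    [Header("Pinch gestures (metres)")]
    public float speakPinchTrigger = 0.025f;
    public float teleportPinchTrigger = 0.02f;

    [Header("Pose checks")]
    [Range(0f, 1f)] public float palmUpDot = 0.6f;
    public float thumbsUpHeight = 0.05f;
}
```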

This competition update laid the foundation for all of these systems; the next update will unify them into a seamless, intuitive, hands-first experience ready for public release.
