Inspiration

We were inspired by the idea that sign language is inherently physical and spatial, but most beginner learning tools are still flat. Videos and pictures can show what a sign looks like, but they do not always make it easy to understand where your fingers should go in 3D space. We wanted to explore whether mixed reality could make sign learning feel more intuitive by letting users directly compare their own hand to a virtual example.

What it does

Our project is a mixed reality prototype for beginner sign practice on Meta Quest 3. The user sees a ghost hand floating in front of them representing a target sign, while their own hand is tracked live in the scene. The system checks whether the user's fingertips align closely enough with the target pose and provides feedback when the sign is matched. We also support multiple hand poses and simple sign navigation, turning the experience into an interactive learning flow rather than a static demo.
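The matching check described above can be sketched as follows. This is a minimal illustration in Python (the project itself is Unity C#); the function name and the 3 cm threshold are assumptions for clarity, not the project's actual values.

```python
from math import dist

# Illustrative threshold: a fingertip counts as aligned when it is within
# 3 cm of its target on the ghost hand (positions in meters, world space).
MATCH_THRESHOLD_M = 0.03

def pose_matched(target_tips, live_tips, threshold=MATCH_THRESHOLD_M):
    """target_tips / live_tips: lists of (x, y, z) fingertip positions.

    The sign counts as matched only when every live fingertip is within
    the threshold distance of its corresponding target fingertip.
    """
    return all(dist(t, l) <= threshold for t, l in zip(target_tips, live_tips))
```

For example, a hand hovering 1 cm off every target still matches, while a hand 20 cm away does not.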

How we built it

We built the project in Unity using the Meta XR SDKs, passthrough mixed reality, and hand tracking on Quest 3. We used Meta's Building Blocks to quickly scaffold the camera rig, passthrough, and tracked hand setup. From there, we created custom ghost hand poses for individual signs and placed fingertip targets on the ghost hand. On the live tracked hand, we used the runtime hand skeleton data to access fingertip positions and compare them against those targets. We then added a simple UI panel and logic to switch between signs and display matching feedback.
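The sign-switching logic mentioned above can be sketched as a small navigator over an ordered list of target poses. This is a hypothetical Python outline of that state machine (the actual implementation is Unity C# driving the UI panel and ghost-hand pose); sign names and the data layout are placeholders.

```python
class SignNavigator:
    """Steps through an ordered list of (name, target_fingertips) poses.

    In the prototype, the current entry would determine which ghost-hand
    pose is rendered and which fingertip targets are compared against the
    live tracked hand.
    """

    def __init__(self, signs):
        self.signs = signs  # list of (name, target_fingertips) tuples
        self.index = 0

    @property
    def current(self):
        return self.signs[self.index]

    def next_sign(self):
        self.index = (self.index + 1) % len(self.signs)  # wrap around
        return self.current

    def prev_sign(self):
        self.index = (self.index - 1) % len(self.signs)
        return self.current
```

Wrapping at both ends keeps the navigation buttons always usable, which suits a simple practice loop better than a dead-end list.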

Challenges we ran into

One major challenge was the development workflow itself. Setting up Unity, Meta XR packages, OpenXR, passthrough, and hand tracking took significant time, and long shader compilation/build times became a recurring bottleneck. We also ran into friction with the abstraction introduced by Building Blocks, since they are great for fast setup but can make it harder to directly access lower-level hand data and hierarchy details. Another challenge was deciding what to simplify. Full sign recognition is a much harder problem than a hackathon allows for, so we had to intentionally narrow scope and focus on fingertip-based pose alignment as a strong proof of concept.

Accomplishments that we're proud of

We are proud that we were able to get a full mixed reality hand-tracking pipeline working on Quest 3 and turn it into a meaningful demo rather than just a technical experiment. In a short amount of time, we built a system that renders a target ghost hand, tracks the user's real hand live, compares fingertip positions, and provides feedback on whether the pose has been matched. We also expanded the prototype to support multiple signs, which helped transform the project from a single-scene proof of concept into something that more clearly communicates its educational potential.

What we learned

We learned a lot about how to rapidly prototype mixed reality interactions under time pressure. A big lesson was that the core experience matters more than perfect technical elegance in a hackathon setting. Instead of trying to solve full sign language recognition, we focused on a believable and intuitive interaction loop: see the target, match it, get feedback. We also learned how hand tracking data is structured in Meta's SDK, how to work with runtime skeleton bones, and how quickly XR projects can become difficult to manage if we overcomplicate the architecture.

What's next for SignGuide AR

The next step would be making the system more expressive and educational. That could include more signs, cleaner pose transitions, better feedback about which fingers are wrong, and more robust pose recognition beyond fingertip alignment. We would also want to polish the presentation of the ghost hand, reduce build times and performance issues, and explore how this could grow into a genuinely useful tool for early sign language learners.
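The per-finger feedback idea above could extend the current all-or-nothing check into a report of which fingertips are off target. A minimal Python sketch, with illustrative finger names and threshold (the real system would work on Unity skeleton data):

```python
from math import dist

FINGERS = ("thumb", "index", "middle", "ring", "pinky")

def finger_errors(target_tips, live_tips, threshold=0.03):
    """Return {finger name: distance} for each fingertip outside the threshold.

    An empty result means the pose is matched; otherwise the UI could
    highlight exactly the fingers that still need adjusting.
    """
    return {
        name: dist(t, l)
        for name, t, l in zip(FINGERS, target_tips, live_tips)
        if dist(t, l) > threshold
    }
```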

Built With

  • c#
  • git
  • github
  • hand-tracking
  • meta-quest-3
  • meta-xr-interaction-sdk
  • meta-xr-sdk
  • openxr
  • passthrough-mixed-reality
  • textmeshpro
  • unity
  • unity-ui