Inspiration

Imagine a world where your office is wherever you want it to be—whether it’s perched on top of Half Dome in Yosemite, beside a serene lakeside, or even on the surface of the moon. This vision of extended reality (XR) became tantalizingly real during the Apple Vision Pro demo, which showcased XR’s potential to revolutionize remote work by bringing immersive virtual workspaces anywhere. But one limitation quickly surfaced: XR keyboards simply can’t keep up with the rapid pace of user ideation, struggling to reach even 30 words per minute despite maximum typing effort. This gap highlighted the need for a more efficient input method that aligns with the fluid, immersive nature of XR.

[Image: Trillium Lake]

As enthusiastic users of productivity tools like Arc Browser, Magnet, and VIM, we’ve experienced firsthand the power and satisfaction of mastering keyboard shortcuts, which can double our speed in performing tasks. For instance, using ChatGPT search within Arc Browser becomes incredibly efficient with the right shortcuts, enhancing productivity and reducing friction. This experience showed us the potential of powerful, intuitive controls to transform how we interact with digital environments.

Moreover, gestures in XR offer an exciting advantage: they introduce increased dimensionality and enable a more natural, seamless transition between actions. Unlike the limited, two-dimensional input of a keyboard and mouse, gestures allow for a three-dimensional interaction that feels instinctive and unobtrusive. In principle, a single gesture can convey far more information than a mouse click or keystroke, since it encodes position, orientation, and motion in three dimensions. Paired with LLMs that interpret the intent behind a gesture, we can generate large chunks of content from a single user action. This opens up new possibilities for a fluid and responsive coding experience, where users can shift effortlessly between commands, creating a workspace that keeps pace with their ideas.

What it does

HandCode is an innovative tool designed to allow users to interact with code in a mixed-reality environment using hand gestures. Currently, users can use gestures to adjust the scope of code they wish to edit, and based on this selection, HandCode provides context-aware recommendations generated by a language model. This feature allows for intuitive, hands-free coding assistance. Looking ahead, we plan to expand HandCode’s functionality to enable it not only to edit but also to autonomously write code, enhancing productivity and making coding in mixed reality a more powerful experience.
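To make the idea of context-aware recommendations concrete, here is a minimal C# sketch of how a gesture-selected line range could be turned into a prompt for the language model. The class name, method signature, and prompt wording are illustrative assumptions, not HandCode's actual implementation.

```csharp
using System.Text;

// Minimal sketch (assumed, not HandCode's real code): builds a prompt that asks
// the language model to edit only the code inside the gesture-selected scope.
public static class PromptBuilder
{
    public static string BuildEditPrompt(string[] fileLines, int firstLine, int lastLine, string userIntent)
    {
        var selected = new StringBuilder();
        for (int i = firstLine; i <= lastLine; i++)
            selected.AppendLine(fileLines[i]);

        // The selected scope is the only code the model is asked to change;
        // the rest of the file can be supplied separately as read-only context.
        return
            "You are assisting inside an XR code editor.\n" +
            $"Edit only the following code (lines {firstLine + 1}-{lastLine + 1}):\n" +
            selected.ToString() + "\n" +
            $"User intent: {userIntent}\n" +
            "Return the rewritten code for this range only.";
    }
}
```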

How we built it

We built HandCode using Unity, a versatile game development engine, which allowed us to create an immersive and responsive mixed-reality interface. One of the primary components we developed is a code display window that presents code in an easily viewable format within the XR environment. We also created an adjustable highlighter box, which users can manipulate to select specific sections of code. This selection process determines the scope for editing, making it possible for HandCode to apply language model recommendations only to the relevant portion. Unity’s flexibility enabled us to combine these interactive elements into a cohesive, functional experience for users.
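As a rough illustration of how the highlighter box can define an editing scope, the Unity C# sketch below maps the box's vertical extent to line indices in the code display. The field names, fixed line height, and layout assumptions are ours for illustration and may differ from HandCode's actual components.

```csharp
using UnityEngine;

// Minimal sketch (assumed): converts the highlighter box's vertical extent into
// a range of line indices in the code window. Assumes the window lays out lines
// top-to-bottom with a fixed lineHeight, and that a hand-tracking layer already
// positions and resizes highlightBox.
public class HighlightScope : MonoBehaviour
{
    public RectTransform codeWindow;      // panel that renders the code text
    public RectTransform highlightBox;    // box the user resizes with gestures
    public float lineHeight = 24f;        // height of one rendered code line (assumed)
    public int totalLines = 200;          // number of lines currently displayed

    // Returns the inclusive (first, last) line range covered by the highlight box.
    public (int first, int last) SelectedLineRange()
    {
        // World-space corners of the box: 0 = bottom-left, 1 = top-left.
        Vector3[] corners = new Vector3[4];
        highlightBox.GetWorldCorners(corners);
        float topLocal    = codeWindow.InverseTransformPoint(corners[1]).y;
        float bottomLocal = codeWindow.InverseTransformPoint(corners[0]).y;

        // Distance from the top edge of the code window, measured in lines.
        float windowTop = codeWindow.rect.yMax;
        int first = Mathf.Clamp(Mathf.FloorToInt((windowTop - topLocal) / lineHeight), 0, totalLines - 1);
        int last  = Mathf.Clamp(Mathf.FloorToInt((windowTop - bottomLocal) / lineHeight), first, totalLines - 1);
        return (first, last);
    }
}
```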

Challenges we ran into

One of the most challenging aspects was creating the adjustable highlighter box that users could control accurately with hand gestures. Achieving this required fine-tuning the system to recognize subtle hand movements, which is essential for precise selection of code segments. Additionally, implementing precise hand controls in a mixed-reality environment posed technical difficulties, as even slight inaccuracies could lead to user frustration. We invested significant effort in calibrating the controls to ensure a smooth, reliable experience.
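For a sense of the kind of calibration involved, the sketch below shows one common approach to taming tracking jitter (not necessarily the exact technique HandCode uses): exponential smoothing plus a small dead zone applied to the raw hand position, so tiny tremors don't make the highlighter edge flicker between lines. The parameter values are assumptions.

```csharp
using UnityEngine;

// Minimal sketch (assumed): filters a raw tracked hand position so that small
// jitter is ignored and larger movements are smoothed, frame-rate independently.
public class SmoothedHandPosition : MonoBehaviour
{
    public float smoothing = 12f;     // higher = snappier, lower = steadier
    public float deadZone = 0.004f;   // metres of movement treated as jitter

    private Vector3 filtered;

    // Call once per frame with the raw fingertip or pinch position.
    public Vector3 Filter(Vector3 raw)
    {
        // Ignore movements smaller than the dead zone.
        if (Vector3.Distance(raw, filtered) < deadZone)
            return filtered;

        // Exponential smoothing toward the raw position.
        float t = 1f - Mathf.Exp(-smoothing * Time.deltaTime);
        filtered = Vector3.Lerp(filtered, raw, t);
        return filtered;
    }
}
```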

Accomplishments that we're proud of

We’re proud of several accomplishments with HandCode. First, we successfully implemented gesture-based scope adjustment, enabling users to define specific areas of code simply by moving their hands. This was a key goal, and seeing it work smoothly in practice was immensely satisfying. We’re also proud of enabling a mixed-reality code editing experience that integrates language model (LLM) support. This feature provides users with intelligent suggestions directly within the XR space, marking a significant step toward a more interactive and intuitive coding process.

What we learned

This project was a tremendous learning experience for our team. It was our first time working with Unity, so we quickly became familiar with its development environment and capabilities. Additionally, it was our first foray into extended reality (XR) development, which required us to understand the unique design and technical considerations of working in a 3D interactive space. From adjusting to Unity’s programming model to designing for user experience in XR, every aspect of this project broadened our technical and creative skills.

What's next for HandCode

Our vision for HandCode includes several exciting future enhancements. First, we plan to implement a feature that allows users to refresh language model recommendations on-demand, so they can receive new suggestions as the code evolves. We also aim to incorporate a built-in version history feature, allowing users to navigate back and forth between different code versions, which can be especially useful in an XR environment where traditional file management might feel less accessible. Additionally, we envision adding functionality for users to annotate code by drawing around it with their fingers, creating a more natural way to mark sections and add notes. Finally, we’re working on a dropdown menu that displays instantiated variables, along with recommendations based on the selected variables, to streamline coding and improve productivity by keeping relevant information readily accessible.
