Inspiration

While working on an internship project this summer, one of our members spent two weeks on a large commit. Meanwhile, a teammate merged an 18-commit MR into the same codebase. GitLab then forced them to rebase two weeks of work onto those 18 commits, resolving merge conflicts one by one while trying to remember the reasoning behind countless earlier decisions. This painful experience wasn't a one-time thing: at various hackathons, version control issues have cost us an unfairly large amount of time, often eating up 4-6 hours of everyone's time. We decided to take that time back by visualizing version control conflicts and leveraging LLMs to resolve tedious rebase conflicts, rather than relying on a sleepless hacker's strained memory.

What it does

We have created a development tool that renders Git trees in the AR environment of the Snap Spectacles. Using the Spectacles' hand tracking, users can grab and move commits around in space, and even open a merge request by bringing two commits together. To make commits easier to understand, we incorporated Google Gemini to summarize the code differences between them, which makes complex Git histories approachable enough that the tool doubles as an educational aid.
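To make the merge gesture concrete, here is a minimal sketch of how bringing two commits together could become a pull request on the backend. The function name, token handling, and branch resolution are illustrative assumptions rather than our exact implementation; the GitHub endpoint itself (`POST /repos/{owner}/{repo}/pulls`) is the standard REST API.

```python
# Sketch: turning a "two commits brought together" gesture into a GitHub
# pull request. Token handling and naming are hypothetical stand-ins.
import os
import requests

GITHUB_API = "https://api.github.com"
TOKEN = os.environ["GITHUB_TOKEN"]  # assumes a personal access token is set

def open_pull_request(owner: str, repo: str, head: str, base: str, title: str) -> str:
    """Open a PR merging branch `head` into `base`; returns the PR's URL."""
    resp = requests.post(
        f"{GITHUB_API}/repos/{owner}/{repo}/pulls",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Accept": "application/vnd.github+json",
        },
        json={"title": title, "head": head, "base": base},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["html_url"]
```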

How we built it

We built our app for the Snap AR Spectacles using Snap AR's Lens Studio. Our backend is powered by Python and FastAPI for fast and reliable data processing. We integrated Google Gemini to generate natural-language descriptions of the code changes we model, as well as descriptions for the pull requests our tool creates, and we connected the backend to the GitHub API for real-time repository interactions and command handling. To optimize our development workflow, we also leveraged dev tools like Perplexity for efficient research and Warp for streamlined terminal work.
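To give a feel for how these pieces fit together, here is a hedged sketch of the diff-summarization path: a FastAPI endpoint fetches a commit comparison from GitHub and asks Gemini for a plain-English summary. The route, prompt, and model name are illustrative assumptions, not our exact code.

```python
# Sketch of the diff-summary flow (FastAPI + GitHub compare API + Gemini).
# The route, prompt wording, and model name are illustrative assumptions.
import os
import requests
import google.generativeai as genai
from fastapi import FastAPI

app = FastAPI()
genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # any available model works

@app.get("/summary/{owner}/{repo}/{base}/{head}")
def summarize_diff(owner: str, repo: str, base: str, head: str) -> dict:
    # GitHub's compare endpoint returns per-file patches between two commits.
    cmp_data = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/compare/{base}...{head}",
        headers={"Accept": "application/vnd.github+json"},
        timeout=10,
    ).json()
    patches = "\n".join(f.get("patch", "") for f in cmp_data.get("files", []))
    prompt = f"Summarize this diff in two sentences for an AR overlay:\n{patches}"
    return {"summary": model.generate_content(prompt).text}
```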

Challenges we ran into

The largest hurdle was working with the newly released Snap AR 2024 Spectacles: the lack of up-to-date tutorials and incomplete documentation made it difficult to fully understand and use certain features, slowing our progress as we navigated the new toolset. Almost all of us were new to AR/VR/XR development, and none of us had ever used Snap's Lens Studio. Learning and implementing a new Lens under tight deadlines was challenging, with hours of trial and error spent on the most trivial tasks. Fortunately, there were tons of great people at the Snap booth and on Slack who were very helpful.

Our application also had to manage a synchronous environment while keeping latency low, making real-time requests to GitHub's API and Gemini to fetch content live. We also spent a lot of time trying to use voice APIs. Although voice commands would have been an intuitive feature for an AR-based tool, the development environment for our hardware didn't support sending voice data or raw audio over network traffic. This prevented us from incorporating voice-based inputs and forced us to stick to purely gesture-based controls, a significant design constraint, since voice interactions could have streamlined the user experience.

Finally, building a tree-like data structure from GitHub's API posed its own complexities. While GitHub offers powerful tools, mapping a repository into a format that could be visualized in AR as a Git tree was no easy task: it involved handling complex branching and merge histories and resolving each commit's parent relationships into something we could lay out in 3D (see the sketch below).
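The core of that mapping looks roughly like the following: walk the commit list GitHub returns and invert each commit's `parents` field into a child adjacency list, which the Lens can then lay out in space. Pagination, authentication, and the repo name are simplified assumptions here.

```python
# Rough sketch of deriving a renderable tree from GitHub's commit list.
# Pagination and auth are omitted; owner/repo are placeholders.
from collections import defaultdict
import requests

def build_commit_tree(owner: str, repo: str) -> dict[str, list[str]]:
    """Map each commit SHA to its children (inverting the `parents` field)."""
    commits = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/commits",
        params={"per_page": 100},
        timeout=10,
    ).json()
    children: dict[str, list[str]] = defaultdict(list)
    for commit in commits:
        for parent in commit["parents"]:  # merge commits have 2+ parents
            children[parent["sha"]].append(commit["sha"])
    return children
```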

Accomplishments that we're proud of

This was our first time working with the Snap Spectacles, Snap AR's Lens Studio, FastAPI, the GitHub API, and Google Gemini, so the learning curve was steep, but we gained a huge amount of experience and built out an amazing product. We explored various AR capabilities of the glasses, such as hand tracking and gesture recognition, which enabled us to create an intuitive and interactive user experience. One of our proudest accomplishments is simplifying the merge operation by automating the pull request workflow, addressing one of the more tedious and complex aspects of development. This app not only solves a significant problem but also has the potential to greatly enhance our own development workflows. We're proud to have brought this innovative solution to life!

What we learned

We gained invaluable experience in developing for 3D environments, specifically in AR human-computer interaction (HCI) design, a first for all of us. We learned how to create intuitive, gesture-based interfaces that seamlessly integrate into the physical world. On the backend, we deepened our understanding of designing APIs for efficient handling of both synchronous and asynchronous requests, ensuring smooth performance even in an AR setting. Our dive into Git internals helped us master the complexities of version control, giving us a new appreciation for how Git operates under the hood.
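One concrete instance of that sync/async lesson: independent upstream requests (to GitHub and to Gemini, for example) should be fired concurrently rather than back to back. A minimal sketch of the pattern, with placeholder endpoints, using httpx and asyncio:

```python
# Minimal sketch of the async lesson: fire independent upstream requests
# concurrently instead of serially. The endpoints are placeholders.
import asyncio
import httpx

async def fetch_repo_views(owner: str, repo: str) -> tuple[list, list]:
    base = f"https://api.github.com/repos/{owner}/{repo}"
    async with httpx.AsyncClient(timeout=10) as client:
        # The two requests are independent, so awaiting them together
        # roughly halves wall-clock latency versus sequential awaits.
        commits, branches = await asyncio.gather(
            client.get(f"{base}/commits"),
            client.get(f"{base}/branches"),
        )
    return commits.json(), branches.json()
```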

What's next for git.ar

We hope to continue adding support for other Git commands, like rebasing and stashing commits, in the future. We would also love to integrate voice command handling once the platform can support voice input and network requests together! Finally, we want to get more developers trying our tool and to add support for other Git hosts like Bitbucket and GitLab. Eventually, we would love to port this application to the consumer-ready Snap AR Spectacles for use in training junior engineers, as well as any new developers with access to AR hardware.
