Please take a look at the video above for a short demonstration of the app; it's best viewed in full screen. In the video, I try to figure out why my C# for loop, which should print 0-4, prints 5 as well. With the duck's help, I work through the problem conversationally and fix it along the way.
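If you can't watch the video, the bug is a classic off-by-one; here's a minimal reconstruction (the exact code in the video may differ):

```csharp
// Intended to print 0-4, but `<=` makes it print 5 as well:
for (int i = 0; i <= 5; i++)
{
    Console.WriteLine(i);
}

// The fix: stop before 5 with `<`.
for (int i = 0; i < 5; i++)
{
    Console.WriteLine(i);
}
```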
Inspiration
If you are a student learning the fundamentals of coding, you might find LLMs like ChatGPT and Copilot useful. But they can hamper your ability to learn those fundamentals: one study found that heavier reliance on LLMs led to increased procrastination and forgetfulness and negatively affected academic performance (Abbas et al., 2024). The authors call for reducing over-reliance on such services and encourage finding a middle ground between technological assistance and learning.
Before LLM services hit the mainstream, some people relied on a framework for problem-solving and even debugging code called rubberducking, in which you explain your code or problem to an inanimate object. The point is to break down what you are trying to accomplish with the code that got you stuck in the first place. By doing so, you are forced out of your own head: you explain what is going on in your code and consider other perspectives on the problem. Very often, by rubberducking a problem, a student lands on a solution without having to do any Googling whatsoever.
But to someone new, this framework can be challenging, and it can be confusing to figure out what the rubberducking process should look like.
Solution
My app, DuckSight, tries to improve on this by focusing on the conversational problem-solving paradigm. It:

- doesn't give the solution outright,
- helps with any programming language,
- avoids other distracting interfaces (besides the mixed-reality rubber duck :) ), keeping the focus on the learning task at hand,
- and works for any student with a mixed-reality headset such as the Quest, where the Presence Platform SDKs seamlessly make it possible to solve code problems IRL with a virtual helper.
How to use it
Using DuckSight, you talk to the virtual rubber duck by clicking the mic button or by waving with your hand fully extended. Speak your problem with intent (think about the problem before you speak), and the duck offers a perspective on where the solution might lie. Go back and forth like this, and you might arrive at a full solution.
But be aware of the HP: each message from the duck costs 1 HP, and once HP hits 0 it resets 24 hours later, to avoid over-reliance. (Also, if you are still solving the same problem after 50 turns, it was probably worth getting the duck's insight around the 10th turn and coming back to it later after some self-reflection.)
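For the curious, the HP gating boils down to a counter with a 24-hour cooldown. A minimal C# sketch of that idea; the names and the max HP value are my assumptions, not the app's actual implementation:

```csharp
using System;

public class DuckHp
{
    const int MaxHp = 50;      // assumed cap; the real value may differ
    int hp = MaxHp;
    DateTime depletedAt;

    // Called before each duck reply; returns false while HP is recharging.
    public bool TrySpend()
    {
        // Restore HP once 24 hours have passed since it hit 0.
        if (hp == 0 && DateTime.UtcNow - depletedAt >= TimeSpan.FromHours(24))
            hp = MaxHp;

        if (hp == 0) return false;

        hp--;
        if (hp == 0) depletedAt = DateTime.UtcNow;
        return true;
    }
}
```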
If you want to ask about a new problem, click the reset button, wait about 5 seconds until the message clears, and then ask.
You can open/close the panel if you want to look at other screens, and an L/R switch button moves the buttons to the left or right of the panel.
How I built it
I built the app in Unity, using the powerful Presence Platform SDK: Passthrough and Scene MR capabilities together with the Interaction and Voice SDKs. I also used the com.openai.unity package by Rage Against The Pixel (under MIT license) to interface between the OpenAI APIs and Unity. The duck asset is from Snowconesolid Productions, free on the Unity Asset Store for anyone to use in their projects.
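Under the hood, the duck's replies come from a chat completion call made through that package. A rough sketch of what such a call looks like with com.openai.unity; the system prompt and model are illustrative rather than the app's actual ones, and exact types can vary between package versions:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using OpenAI;
using OpenAI.Chat;
using UnityEngine;

public class DuckBrain : MonoBehaviour
{
    OpenAIClient api;

    void Awake() => api = new OpenAIClient(); // loads the API key from its auth config

    // Sends the student's transcribed problem and returns the duck's reply.
    public async Task<string> AskDuck(string problem)
    {
        var messages = new List<Message>
        {
            // Illustrative system prompt: nudge toward hints, never full solutions.
            new Message(Role.System, "You are a rubber duck. Offer perspectives and questions, not outright solutions."),
            new Message(Role.User, problem)
        };
        var request = new ChatRequest(messages, model: "gpt-4o"); // model choice is an assumption
        var response = await api.ChatEndpoint.GetCompletionAsync(request);
        return response.FirstChoice.Message.Content.ToString();
    }
}
```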
Challenges I ran into
One of my challenges came from using the Meta Voice SDK with Wit.ai. I implemented the Voice dictation SDK; the transcription accuracy isn't perfect, but the AI makes up for it by asking whether that's what you meant, or in some cases by inferring what you meant to say.
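For reference, wiring up dictation looks roughly like this. This is a sketch from memory of the Voice SDK's dictation API; treat the namespace, `DictationService`, and event names as assumptions that may differ across SDK versions:

```csharp
using Meta.WitAi.Dictation; // namespace per recent Voice SDK versions (assumption)
using UnityEngine;

public class DuckDictation : MonoBehaviour
{
    [SerializeField] DictationService dictation; // e.g. an AppDictationExperience in the scene

    void OnEnable()
    {
        // Fires once Wit.ai returns the final transcript for an utterance.
        dictation.DictationEvents.OnFullTranscription.AddListener(OnTranscribed);
    }

    void OnDisable()
    {
        dictation.DictationEvents.OnFullTranscription.RemoveListener(OnTranscribed);
    }

    void OnTranscribed(string transcript)
    {
        // Forward even imperfect transcripts; the LLM can ask for clarification.
        Debug.Log($"Heard: {transcript}");
    }

    public void OnMicButton() => dictation.Activate(); // start listening
}
```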
I tried to implement the Mixed Reality Utility Kit to find a spawn position for the duck on a table, but the whole app stuttered for some reason. I wasn't able to find a fix in time, so the duck spawns floating at the position where the app first opens.
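The fallback placement is simple; a hypothetical sketch of spawning the duck in front of the headset at startup (the names are mine, not the app's):

```csharp
using UnityEngine;

public class DuckSpawner : MonoBehaviour
{
    [SerializeField] GameObject duckPrefab;

    void Start()
    {
        // Fallback: place the duck a comfortable distance in front of the
        // headset at startup, instead of anchoring it to a detected table.
        var head = Camera.main.transform;
        var position = head.position + head.forward * 0.8f;
        var lookAtUser = Quaternion.LookRotation(-head.forward, Vector3.up);
        Instantiate(duckPrefab, position, lookAtUser);
    }
}
```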
The buttons are mostly responsive, but don't press a button rapidly; press it slowly instead.
Accomplishments that I'm proud of
- Built a full app using Meta's Presence Platform and learned a lot
- Integrated 3 independent systems to work as intended (mostly)
What's next for DuckSight
- Implement the Depth API to add realism
- Implement a table spawn position using improved means
- Passthrough doesn't expose the camera video, but someday I'd like to let the duck see what I'm looking at instead of me having to explain the whole problem.
!!!Before installing APK!!!
Please install the APK with the Meta Quest Developer Hub, as it fully sets up the permissions the app needs. (RECOMMENDED)
If you want to load it through other means, load the APK onto the headset, and before starting the app make sure the necessary permissions (Microphone, Spatial Data) are enabled by going to Settings > Apps > Installed Apps > DuckSight.
Tested on the Quest 3

