Inspiration
We wanted to create an assistive technology that would let students keep their eyes on the blackboard during a lecture while simultaneously seeing their notes in their field of vision, so they could take notes without ever looking away. The idea came from the frustration of constantly shifting attention between the front of the room and the paper in front of you. What if we could instead merge these two visual fields into one and power up your lecture experience with "DuoVision"?
What It Does
DuoVision is a pair of “smart glasses” that uses:
A two-axis mirror system driven by servos to track and reflect an image feed (projector/lecture slides) into the user’s line of sight.
A camera (connected via a TPU) that streams a live view of the lecture content.
A computer vision model that uses OpenCV instance segmentation to determine the position of the user’s notepad.
A custom controller to adjust mirror positions to ensure the notepad remains visible as the user shifts their gaze or moves the paper.
This way, students can see the professor’s lecture content and their own notes simultaneously.
How We Built It
Mirror Mechanics:
We custom-fabricated two small mirrors, cut to the shape needed for optimal reflection into the user’s field of view.
Each mirror is mounted on a servo axis (one horizontal, one vertical). When the camera/computer vision system detects a shift in the notepad’s position, the servos pivot the mirrors to maintain alignment with the user’s gaze.
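At its core, this tracking step is a proportional update: the pixel offset between the detected notepad centre and the frame centre is scaled into small servo corrections. A minimal sketch of that logic (frame size, gain, and function names are illustrative, not our actual firmware):

```python
# Hypothetical proportional controller: nudges the pan/tilt mirror servos
# toward the notepad's detected centre. Gains and limits are placeholders
# that would be tuned on the real hardware.

FRAME_W, FRAME_H = 640, 480              # assumed camera resolution
GAIN_DEG_PER_PX = 0.05                   # proportional gain (hand-tuned)
PAN_LIMITS = TILT_LIMITS = (0.0, 180.0)  # typical hobby-servo travel

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def update_mirror_angles(pan_deg, tilt_deg, notepad_cx, notepad_cy):
    """Return new (pan, tilt) angles moving the view toward the notepad."""
    error_x = notepad_cx - FRAME_W / 2   # +ve: notepad right of centre
    error_y = notepad_cy - FRAME_H / 2   # +ve: notepad below centre
    pan_deg = clamp(pan_deg + GAIN_DEG_PER_PX * error_x, *PAN_LIMITS)
    tilt_deg = clamp(tilt_deg + GAIN_DEG_PER_PX * error_y, *TILT_LIMITS)
    return pan_deg, tilt_deg
```

Running this once per detected frame keeps corrections small and smooth; the clamp guards against commanding angles past the servos' mechanical range.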
Camera and TPU Integration:
An ESP32 camera initially seemed like a straightforward choice for capturing the live feed. We wanted to stream the camera feed to a mobile application so the lecture slides or professor’s notes would be overlaid onto the user’s field of view. We integrated a TPU to handle on-device image processing more efficiently, reducing latency and improving the real-time mirror adjustments.
Software & Connectivity:
We started by trying to connect our ESP32 camera feed via Bluetooth Low Energy (BLE) in an Expo React Native app. However, Expo’s build environment had limited support for certain React Native BLE libraries and dependencies. We also faced difficulty integrating the cutting-edge Web Bluetooth API. We pivoted to alternative solutions, including exploring Streamlit Bluetooth APIs, which gave us broader support for both classic Bluetooth (SPP) and the newer BLE stack. Ultimately, we used a double localhost setup: one local server for the desktop environment and another for mobile views. This allowed the glasses to connect wirelessly for real-time video streaming and control commands, while a secondary server provided a user-facing dashboard and additional controls.
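The double-localhost layout can be sketched with nothing but the standard library: one local server stands in for the streaming endpoint the glasses talk to, the other for the user-facing dashboard. (Our actual build used Streamlit for the dashboard side; the handlers, payloads, and ports below are illustrative.)

```python
# Minimal stdlib sketch of the "double localhost" setup: two independent
# HTTP servers on 127.0.0.1, each in its own thread. Port 0 lets the OS
# pick a free port, which the caller reads back from server_address.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_handler(body: bytes):
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, fmt, *args):  # silence per-request logging
            pass
    return Handler

def serve(body: bytes) -> HTTPServer:
    server = HTTPServer(("127.0.0.1", 0), make_handler(body))
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

stream_srv = serve(b"frame-bytes")     # stand-in for the video stream server
dash_srv = serve(b"dashboard-html")    # stand-in for the dashboard server
```

Separating the two keeps the latency-sensitive stream loop independent of the dashboard, so a slow UI request never stalls a frame.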
Computer Vision:
Using Sobel edge detection, we track the boundaries of the notepad as the user moves their hands or changes angles. This data informs the servo motors to rotate the mirrors accordingly, ensuring that the writing area is always visible within the user’s augmented view.
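The Sobel step amounts to sliding two 3x3 gradient kernels over the frame, taking the gradient magnitude, and bounding the strong edges. A NumPy-only sketch of that idea (our build calls OpenCV's cv2.Sobel; the kernels below are the same operators, and the function and threshold names are our illustration):

```python
import numpy as np

# Standard 3x3 Sobel kernels: KX responds to vertical edges, KY (its
# transpose) to horizontal edges.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def filter2d(img, kernel):
    """Valid-mode sliding-window cross-correlation (as cv2.filter2D does)."""
    h, w = img.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def notepad_bbox(gray, thresh=1.0):
    """Return (x0, y0, x1, y1) around the strongest edges, or None."""
    mag = np.hypot(filter2d(gray, KX), filter2d(gray, KY))
    ys, xs = np.nonzero(mag > thresh)
    if xs.size == 0:
        return None
    # +1 maps valid-convolution indices back to original frame coordinates
    return xs.min() + 1, ys.min() + 1, xs.max() + 1, ys.max() + 1
```

Since a notepad page is a bright rectangle against a darker desk, its border produces the dominant edge responses, and the bounding box of those responses is what drives the mirror servos.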
Challenges
BLE Integration with Expo:
We spent significant time trying to get the ESP32 camera’s BLE connection working on both iOS and Android. The Expo build process had issues installing some native modules required for BLE, leading us to pivot and search for new approaches.
Limited React Native Library Support:
We discovered that many advanced React Native Bluetooth libraries weren’t fully supported in Expo, which restricted our ability to directly connect the camera feed to our mobile app.
Multiple Pivots in Bluetooth Solutions:
We tried going from the cutting-edge Web Bluetooth API to Streamlit Bluetooth to see if we could unify classic and BLE connections, all while juggling different platforms (iOS, Android, web).
Mirror Fabrication:
Achieving the right field of view involved several iterations and the use of glass cutters to shape the mirrors precisely. Even a small error in shape or angle would distort the user’s view.
Accomplishments
Successfully built a two-axis servo mirror system that dynamically adjusts to the user’s line of sight.
Managed to stream camera footage via local servers, enabling near real-time updates.
Implemented Sobel edge detection to track the position of the notepad, ensuring a seamless viewing experience.
Learned the constraints of Expo and React Native libraries firsthand, pivoting to workable solutions under tight time constraints.
What We Learned
BLE on Mobile: Integrating BLE in a cross-platform environment (especially with Expo) can be tricky. Native modules are often required, and Expo’s managed environment can limit direct access to those modules.
Importance of Prototyping Hardware Early: Fabrication of physical components (mirrors and servo mounts) needs to happen early to allow enough time for iteration.
Multiple Development Environments: Using a double localhost approach taught us how to separate functionality between front-end and back-end services effectively, especially when bridging hardware and software constraints.
What’s Next
Refining the Mirror Mechanics: We plan on adding feedback loops for more precise mirror positioning and better calibration.
Improving Battery Life: We want to optimize servo use to reduce power draw and potentially add a lightweight battery pack for fully wireless operation.
Enhanced CV Features: We hope to integrate text detection for real-time OCR, so students can interact with digital text overlays.
Conclusion
DuoVision merges optical engineering, computer vision, and software integration into a portable, user-friendly AR system. By tackling challenges in Bluetooth connectivity, mirror fabrication, and real-time streaming, we have opened the door for innovative ways to view and interact with digital information—in this case, letting students effortlessly see lecture notes and their own notepads at the same time.
