Inspiration

Inspiration for NeuroScent arises from the current limitations of XR systems, which usually confine immersion to the most common sensory domain: vision through a head-mounted display (HMD). While HMDs are wonderful interfaces for visual training, productivity, and learning, there is potential to expand into more sensory modalities. Sound and haptics are often added in modern standalone HMD systems, but more could be done: integrating the sense of smell and biosensing. We humans can perceive more than one trillion smells, and those smells shape our mental state. Additionally, biosensors (non-invasive EEG brain-computer interfacing, PPG heart-rate monitoring, and EMG muscle-movement sensing) provide meaningful indications of a user's current mental state.

To promote users' mental well-being based on multimodal human sensing, we created an XR biofeedback system that incorporates olfaction (scent) alongside the Varjo head-mounted display (vision) and the OpenBCI Galea.

We were also inspired by this paper for the olfactory display and its hardware design: https://hal.science/hal-03838757v1/file/Nebula_VRST_2022%20%281%29.pdf

What it does

NeuroScent is an XR system with a custom olfactory display that releases scents to promote the user's mental well-being based on real-time biofeedback.

How we built it

Hardware: There are previous open-source projects along these lines, but given the hackathon's time constraints and limited component selection, we opted to build a simple custom design. We bought two cheap diffusers from CVS and harvested their ultrasonic atomizers to convert essential oils into mist.

Then, by modifying the printed circuit boards inside the diffusers, we connected them to our ESP32 through relays (electronically controlled switches), along with several fans to provide proper airflow and scent diffusion. Finally, we used Fusion 360 (CAD software) to modify a pre-existing case we found so it could accommodate our limited hackathon components.

The ESP32 then interfaced with the VR headset through a USB serial connection to Unity.
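As a rough illustration, the Unity side of that link can open the serial port and send simple text commands to the ESP32. This is only a minimal sketch: the port name, baud rate, and command strings below are assumptions, not our exact firmware protocol.

```csharp
// Minimal sketch of the Unity-to-ESP32 serial link.
// The COM port, baud rate, and "SCENT" command format are assumed for illustration.
using System.IO.Ports;
using UnityEngine;

public class ScentSerialBridge : MonoBehaviour
{
    SerialPort port;

    void Start()
    {
        // System.IO.Ports requires the .NET 4.x API compatibility level in Unity.
        port = new SerialPort("COM3", 115200);
        port.Open();
    }

    // Called by other scripts, e.g. the biofeedback controller.
    public void SetScent(int channel, bool on)
    {
        if (port != null && port.IsOpen)
            port.WriteLine($"SCENT {channel} {(on ? 1 : 0)}"); // e.g. "SCENT 0 1"
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```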

Software:

  • EEG, PPG, and EMG input: Galea GUI.
  • Unity scene: We used Shader Graph, particle systems, and animation scripts linked to objects to create the visual effects.
  • C#: Unity scripting to command the ESP32 to release scents based on biofeedback streamed from the Galea (see the sketch below). It was more complicated than it sounds!
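To give a sense of that C# layer, here is a simplified sketch of how a streamed biosignal could gate scent release. The relaxation metric, threshold, and cooldown are placeholders we invented for illustration, and the actual Galea data path into Unity is abstracted behind a hypothetical GetRelaxationScore() helper.

```csharp
// Simplified sketch: gate scent release on a streamed biosignal.
// GetRelaxationScore(), the threshold, and the cooldown are hypothetical values.
using UnityEngine;

public class BiofeedbackScentController : MonoBehaviour
{
    public ScentSerialBridge bridge;      // sends serial commands to the ESP32
    public float calmThreshold = 0.6f;    // assumed normalized 0..1 relaxation metric
    public float cooldownSeconds = 30f;   // avoid saturating the room with scent

    float lastTrigger = -999f;

    void Update()
    {
        float relaxation = GetRelaxationScore(); // placeholder for the Galea-derived metric

        if (relaxation < calmThreshold && Time.time - lastTrigger > cooldownSeconds)
        {
            bridge.SetScent(0, true);            // release the calming scent briefly
            Invoke(nameof(StopScent), 5f);
            lastTrigger = Time.time;
        }
    }

    void StopScent() => bridge.SetScent(0, false);

    float GetRelaxationScore()
    {
        // In the real system this comes from the Galea data stream;
        // here we return a dummy value so the sketch compiles.
        return 0.5f;
    }
}
```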

Challenges we ran into

The PC we were given would crash at times, requiring complicated recalibration, and some of its ports stopped working. We also ran into issues charging the headset, since we could not leave it charging overnight.

Accomplishments that we're proud of

  • We managed to develop rudimentary yet functional hardware and software that worked closely together. Seeing it all come together in a seamless XR experience was incredibly rewarding, as was being able to test and demonstrate the software on a cutting-edge device like the Galea.
  • Pulling all-nighters together!

What we learned

We learned how to bring biodata from the Galea headset into Unity! We also learned how to optimize a Unity scene by keeping polygon counts low to maximize FPS, using techniques like occlusion culling and cutting off the scene beyond a certain viewing distance (see the example below).
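For the distance cutoff specifically, the idea boils down to shortening the camera's far clipping plane and keeping occlusion culling enabled; the 75 m value below is just an example, not the setting we shipped.

```csharp
// Example: cap the viewing distance and use occlusion culling to cut draw calls.
// The 75 m far plane is an arbitrary example value.
using UnityEngine;

public class SceneOptimizationSettings : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        cam.farClipPlane = 75f;          // objects beyond this distance are not rendered
        cam.useOcclusionCulling = true;  // requires occlusion data baked in the editor
    }
}
```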

What's next for NeuroScent

  • More scenes in the immersive visual experience so it can benefit the user over longer spans of time, along with a wider range of scents and different combinations of scents. We will essentially increase the complexity for an even better experience.
  • Applying the system to healthcare; for example, helping patients calm down in our immersive experience during local anesthesia procedures, which could reduce overall cost.
