Inspiration

Our inspiration for creating NeuroNavigate stems from reports that the number of special education teachers declined during the COVID-19 pandemic. We wanted to develop a tool that could help fill this gap and support children with autism, ensuring they can still access valuable resources and practice important skills even when teacher availability is limited.

What it does

Our app gives students with autism a platform to practice conversation skills and emotion recognition. Through interactive exercises and feedback, users can strengthen their social abilities even when direct teacher guidance is scarce. The tool aims to empower students, offering them a supportive environment in which to develop essential life skills independently.

How we built it

NeuroNavigate is built in Python with a Flask backend. OpenCV and Mediapipe handle facial and eye tracking, Whisper provides real-time speech transcription, and OpenAI's GPT drives the simulated conversations. The core emotion recognition capability is powered by a VGG19 neural network trained on the FER-2013 dataset and adapted for real-time interaction: the model classifies facial expressions from grayscale face images, enabling the app to understand and react to the user's emotional state during gameplay.
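As a concrete illustration (not the project's actual code), the glue between the VGG19 classifier and the rest of the app might look like the sketch below. It assumes the standard FER-2013 setup — 48x48 grayscale face crops and the dataset's conventional seven-class label order — either of which the real model may configure differently.

```python
import numpy as np

# FER-2013's seven emotion classes in the dataset's conventional index order
# (an assumption here; the project's model may order its outputs differently).
FER_LABELS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def preprocess_face(gray_face: np.ndarray) -> np.ndarray:
    """Scale a 48x48 grayscale face crop to [0, 1] and add the batch and
    channel axes that a FER-2013-trained network typically expects."""
    if gray_face.shape != (48, 48):
        raise ValueError("expected a 48x48 grayscale crop")
    x = gray_face.astype("float32") / 255.0
    return x.reshape(1, 48, 48, 1)

def decode_emotion(probabilities: np.ndarray) -> str:
    """Map the model's softmax output vector back to an emotion label."""
    return FER_LABELS[int(np.argmax(probabilities))]
```

In the app, a frame from the OpenCV capture loop would be cropped to the detected face, passed through `preprocess_face`, run through the VGG19 model, and the softmax output decoded with `decode_emotion`.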
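The detected emotion can likewise be folded into the simulated conversation. The helper below is a hypothetical sketch of how a chat-style message list for GPT might be assembled each turn; the function name and prompt wording are illustrative, not taken from the project.

```python
def build_chat_messages(history, user_text, detected_emotion):
    """Assemble a chat-completion style message list, folding the detected
    emotion into the system prompt so the model can respond appropriately.
    All names and prompt text here are illustrative assumptions."""
    system_prompt = (
        "You are a friendly conversation partner helping a child practice "
        f"social skills. The child currently looks {detected_emotion}; "
        "respond warmly and in simple language."
    )
    return (
        [{"role": "system", "content": system_prompt}]
        + list(history)
        + [{"role": "user", "content": user_text}]
    )
```

Each turn, the Whisper transcription would supply `user_text`, the emotion model would supply `detected_emotion`, and the resulting list would be sent to the chat completions API, with the reply appended to `history`.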

Challenges we ran into

We encountered several challenges, including the difficulty of accurately tracking eye movements and facial expressions under varied lighting and in dynamic conditions. Making the app's interface intuitive and engaging for children also required extensive testing and iteration. Furthermore, managing data privacy and security, given the sensitive nature of the information processed, was a critical area of focus.

Accomplishments that we're proud of

We are particularly proud of the app's ability to adapt dynamically to each child's interaction, providing a highly personalized learning experience. The positive feedback from initial trials with educators and parents has been incredibly rewarding. Moreover, our success in integrating complex technologies seamlessly in a real-time application shows what we can achieve in educational technology for children with special needs.

What we learned

This project deepened our understanding of both the technological and psychological aspects of designing educational tools for children with autism. We gained insights into the importance of user-centered design and the effectiveness of integrating real-time feedback mechanisms. Additionally, the process highlighted the value of interdisciplinary collaboration, bringing together fields such as AI, machine learning, psychology, and education.

What's next for NeuroNavigate

Our next steps involve applying to Y Combinator to gain access to critical resources, mentorship, and funding opportunities. We aim to refine NeuroNavigate through further pilot studies and expand its reach into new markets by adapting the app for different languages and cultural contexts. Our long-term vision is to explore applications of this technology for adults with autism and other communication challenges, ultimately aiming to make a global impact.

Built With

Flask, Mediapipe, OpenAI GPT, OpenCV, Python, VGG19, Whisper
