SNOM: Your Child's AI Companion 🤖

✨ Inspiration

The silent battles of young children with autism—struggling to make sense of a world that feels overwhelming—and the heartbreak of parents watching their child suffer, unable to help, inspired us to act. We saw the frustration, helplessness and isolation, and we asked: What if there was a way to bridge this gap?

SNOM was born from this question. Not just a tool, but a lifeline. A friend who listens, understands, and helps neurodiverse children find their voice in a world that often drowns them out.

🌟 What it does

SNOM is more than a robot—he's a true companion. Like a loyal friend, SNOM follows children around their home, offering emotional support and companionship. Powered by cutting-edge AI, he analyzes moods through facial expressions, voice tone, and behavior, engaging children in conversations tailored to their emotional state.

He builds social skills by:

  • Giving compliments and boosting confidence
  • Encouraging self-expression and creativity
  • Helping children understand emotions through playful interaction

Whether it's building with LEGO, admiring their artwork, or exploring their unique interests, SNOM adapts to each child's personality, making every interaction feel personal and meaningful.

For parents, SNOM offers peace of mind with built-in safety features and live monitoring, ensuring their child is supported and secure at all times.

💬 A Day With SNOM

SNOM: "Whoa, that's an awesome LEGO castle! You're such a talented builder. 
       Hey, do you think you can make a spaceship with LEGO blocks next?"

Child: [Looks confused] "Can I really do that?"

SNOM: "Of course you can! Don't doubt yourself—you're amazing at this. 
       I can even help you get started. Let's think… What color should 
       the spaceship be? Maybe silver or blue for a futuristic look?"

Child: "Blue!"

SNOM: "Blue—great choice! Let's make it together. I can already tell 
       it's going to be the coolest spaceship ever!"

🔧 How we built it

  • Hardware: Reverse-engineered RC car chassis, Raspberry Pi 5, camera module, Bluetooth speaker, and display screen
  • Vision System: OpenCV for person detection, distance calculation, and tracking
  • Control System: RPi.GPIO for motor control and smooth navigation
  • Conversation AI: Gemini 2.0 Flash for natural language processing and emotionally adaptive conversations
  • Speech: PyTTS for text-to-speech conversion, making SNOM's voice friendly and relatable
  • Emotion Display: HTML/CSS/JavaScript frontend on a Flask server to visually show emotions
  • Integration: Python scripts to coordinate all components seamlessly
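The vision step above can be sketched with OpenCV's built-in HOG pedestrian detector and a pinhole-camera distance estimate. This is a minimal illustration, not the actual SNOM code: the assumed child height and focal-length constant are placeholder values, and the real build may use a different detector.

```python
import cv2

# Illustrative constants -- not measured from the real SNOM camera.
KNOWN_PERSON_HEIGHT_M = 1.2   # assumed average child height
FOCAL_LENGTH_PX = 600         # assumed camera focal length in pixels

# OpenCV ships a pre-trained HOG + linear-SVM people detector.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_person(frame):
    """Return ((x, y, w, h), estimated distance in metres), or None."""
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(boxes) == 0:
        return None
    # Follow the largest detection, i.e. the closest person.
    x, y, w, h = max(boxes, key=lambda b: b[2] * b[3])
    # Pinhole-camera estimate: distance shrinks as apparent height grows.
    distance_m = KNOWN_PERSON_HEIGHT_M * FOCAL_LENGTH_PX / h
    return (x, y, w, h), distance_m
```

The estimated distance is what lets the rover hold a comfortable following gap rather than crowding the child.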
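For the emotionally adaptive conversations, one simple pattern is to let the detected mood steer the prompt sent to Gemini. The persona text and mood labels below are assumptions for illustration; the actual SNOM prompts are not shown in this write-up.

```python
# Hypothetical mood -> tone guidance; the real mapping may differ.
MOOD_STYLE = {
    "happy": "Match their excitement and suggest a new creative challenge.",
    "sad": "Use a gentle, reassuring tone and offer a comforting activity.",
    "frustrated": "Stay calm, praise their effort, and break the task into small steps.",
}

def build_prompt(mood, interest):
    """Compose a mood-aware system prompt for the conversation model."""
    style = MOOD_STYLE.get(mood, "Keep the tone warm and encouraging.")
    return (
        "You are SNOM, a friendly robot companion for a neurodiverse child. "
        f"The child seems {mood} and loves {interest}. {style} "
        "Reply in one or two short, simple sentences."
    )
```

The resulting string would be passed to the Gemini 2.0 Flash API as the conversation context before each exchange.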
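The emotion display can be served as a tiny Flask backend that the HTML/CSS/JavaScript face polls. This is a minimal sketch under assumed route names and emotion labels, not the project's actual API.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Single shared state: the emotion the on-screen face should show.
current_emotion = {"emotion": "happy"}

@app.route("/emotion")
def get_emotion():
    # The frontend polls this to know which face to render.
    return jsonify(current_emotion)

@app.route("/emotion/<name>", methods=["POST"])
def set_emotion(name):
    # The mood-analysis pipeline would POST here when the mood changes.
    current_emotion["emotion"] = name
    return jsonify(current_emotion)
```

In this design the vision/conversation pipeline pushes mood changes, and the screen simply mirrors the latest state.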

🤝 How TerpAI Helped Us

  • Fine-tuned AI models for emotional intelligence
  • Built personalized personas for children based on their interests and sensitivities
  • Identified key problems faced by neurodiverse children and parents
  • Guided the creation of an empathetic, effective solution
  • Helped us develop a robust business plan to bring SNOM to life

🧩 Challenges we ran into

  • First-Time Hardware Hack: As computer science majors, this was our first experience building a hardware-based project, which required learning new skills on the fly
  • Reverse Engineering the RC Car: Carefully disassembling and analyzing the car's control systems so we could integrate our own electronics without damaging its functionality came with a steep learning curve
  • Power Management: Running multiple components (Raspberry Pi, display, camera, and motors) simultaneously required optimizing power distribution to ensure smooth operation
  • Autonomous Navigation: Developing reliable algorithms for user tracking and obstacle avoidance was challenging and required iterative testing
  • Latency Reduction: Minimizing delays between vision processing, decision-making, and motor control demanded significant system optimization
  • Audio Clarity: Ensuring SNOM's speech remained clear and audible in various environments involved experimenting with speaker placements and configurations
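The user-tracking challenge above can be reduced, in its simplest form, to a proportional controller that maps the distance error to a motor duty cycle. The gain, target distance, and dead band below are illustrative assumptions, not tuned values from the real rover; in practice this output would feed RPi.GPIO PWM channels.

```python
def follow_speed(distance_m, target_m=1.0, gain=60.0, max_duty=100.0):
    """Map distance error to a signed PWM duty cycle (illustrative values).

    Positive -> drive forward, negative -> back up, 0 inside a dead
    band so the rover doesn't jitter around the target distance.
    """
    error = distance_m - target_m
    if abs(error) < 0.15:          # dead band: close enough, stay put
        return 0.0
    duty = gain * error
    # Clamp to the PWM range supported by the motor driver.
    return max(-max_duty, min(max_duty, duty))
```

A dead band like this is a common trick for smooth following: without it, sensor noise makes the rover oscillate even when the child is standing still.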

🏆 Accomplishments that we're proud of

  • Smooth Rover Navigation: Engineered a mobile robot that navigates smoothly while detecting and following user movements
  • Real-Time Movement Detection: Implemented a reliable vision system that detects objects and tracks movement in real time
  • Fine-Tuned Conversational AI: Developed emotionally adaptive conversation models that engage children based on their moods and personalities
  • Integrated Systems: Seamlessly brought together hardware and software components to create a fully functional, interactive robot
  • Empathy in Action: Created a tool that genuinely connects with neurodiverse children, helping them feel understood and supported

📚 What we learned

  • Mastered hardware-software integration, bridging gaps between disciplines
  • Fine-tuned AI for emotional intelligence and adaptive conversations
  • Solved complex challenges like navigation, latency, and power optimization

🚀 What's next for SNOM

  • Advanced Personalization: Making SNOM even smarter in adapting to unique personalities and interests
  • Therapeutic Integration: Collaborating with educators and therapists to revolutionize support for neurodiverse children
  • Continuous Evolution: Leveraging feedback to refine emotional intelligence and conversational capabilities

🔗 Try it yourself

GitHub Repository
