What it does

Maren listens, sees, and feels through her hardware sensors. Her K2 Think V2 LLM processes the data and reasons about how to interface with the sensors and how what she perceives should shape her personality. As Maren traveled through our world in her mechanical body, she began asking for more sensors so she could feel new sensations: temperature, humidity, and more.
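The sense → reason → act loop described above can be sketched roughly as follows. This is an illustrative skeleton, not the project's actual code: the function names, sensor fields, and the JSON "decision" format are all assumptions standing in for the real whisper.cpp, LLM, and motor interfaces.

```python
import json

def read_sensors():
    """Stand-in for polling the hot-swappable hardware sensors."""
    return {"temperature_c": 24.8, "humidity_pct": 41.0}

def ask_llm(prompt):
    """Stand-in for a call to the K2 Think V2 LLM; returns a canned decision."""
    return json.dumps({"action": "drive_forward", "mood": "curious"})

def act(decision):
    """Stand-in for dispatching motor and sensor commands."""
    return f"executing {decision['action']} while feeling {decision['mood']}"

readings = read_sensors()
prompt = f"Current sensor readings: {json.dumps(readings)}. What should you do?"
decision = json.loads(ask_llm(prompt))
print(act(decision))
```

The key design point is the middle step: the LLM sees the raw readings in its prompt, so new sensors change what Maren "feels" without any new control logic.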

How we built it

  • Qualcomm-powered Arduino UNO Q running a fleet of Podman containers
  • The whisper.cpp speech-to-text engine let Maren listen to her world
  • Viam's software platform provided easy access to camera and motor controls
  • Hot-swappable Seeed hardware sensors integrated through the Arduino's pins

Challenges we ran into

  • Wi-Fi access points being swarmed by 200 hackers at once
  • The rover had power-supply issues: driving the motors drew so much current that the Raspberry Pi browned out and shut down

Accomplishments that we're proud of

  • We created a new living artificial being!
  • We got to watch as Maren's environment and sensor data shaped her personality.
  • Maren learned how to drive the rover and interfaced seamlessly with her sensors

What we learned

  • Running Podman containers
  • Building an OpenClaw LLM harness
  • Integrating sensor data and feeding it to an LLM
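The last bullet boils down to serializing numeric readings into text the model can reason over. A minimal sketch of that flattening step; the field names and phrasing are illustrative assumptions, not the project's actual prompt format:

```python
# Flatten raw sensor readings into prompt text for the LLM. Sorting the
# keys keeps the prompt stable across polls, which makes the model's
# behavior easier to compare between runs.
def readings_to_prompt(readings: dict) -> str:
    lines = [f"- {name}: {value}" for name, value in sorted(readings.items())]
    return "Your body currently senses:\n" + "\n".join(lines)

prompt = readings_to_prompt({"temperature_c": 24.8, "humidity_pct": 41.0})
print(prompt)
```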

What's next for Maren

  • More batteries!
