Inspiration
Our teammate Daniel Teal had an amazing idea: an autonomous cart. We think connected and autonomous devices will be an important part of the future, so we decided to prototype it. We call it PhaseBot.
What it does
While other autonomous bots rely on vision sensing, we decided to use sound to find our way through busy environments, from store aisles to airport crowds of moving passengers and trolleys. PhaseBot measures the phase difference between its sensors after receiving a signal from a source (say, your phone). It then uses that phase difference, along with other information encoded in the signal, to compute your position, which lets PhaseBot follow you around.
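The phase-difference idea can be sketched in a few lines: the phase offset of a known tone between two microphones gives a time difference of arrival, which (under a far-field assumption) maps to a bearing. This is an illustrative sketch, not PhaseBot's actual code; the tone frequency and microphone spacing are assumed values.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def bearing_from_phase(phase_diff_rad, freq_hz, mic_spacing_m):
    """Estimate the bearing of a sound source from the phase difference
    of a tone between two microphones (far-field assumption).

    phase_diff_rad : measured phase difference in radians
    freq_hz        : frequency of the tracked tone
    mic_spacing_m  : distance between the two microphones
    Returns the angle in radians from the array's broadside axis.
    """
    # Phase difference -> time difference of arrival
    tdoa_s = phase_diff_rad / (2 * math.pi * freq_hz)
    # Path-length difference as a fraction of the spacing,
    # clamped to the physically possible range for asin()
    ratio = max(-1.0, min(1.0, SPEED_OF_SOUND * tdoa_s / mic_spacing_m))
    return math.asin(ratio)
```

A zero phase difference means the source sits on the broadside axis (angle 0); larger offsets swing the bearing toward one microphone or the other. Note that phase wraps every wavelength, so spacing and frequency have to be chosen so the true offset stays within one cycle.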
The real benefit of this project is the technology: we combined TDOA (time difference of arrival) sensing with FSK (frequency-shift keying) for an acoustic system that tracks both the distance and direction of a sound transmitter (say, a phone), for the first time to our knowledge. This could guide robots or, with cheap microphones placed around a store, track customer phones to identify popular products and congested aisles.
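FSK simply assigns one tone frequency to a 0 bit and another to a 1 bit, which is how extra information can ride on the same acoustic signal the microphones are already tracking. Below is a minimal, self-contained sketch of an FSK modulator and demodulator; the tone frequencies, sample rate, and baud rate are illustrative choices, not PhaseBot's actual parameters.

```python
import math

F0, F1 = 2000.0, 2400.0   # assumed tone frequencies for bits 0 and 1
SAMPLE_RATE = 44100       # samples per second
BAUD = 100                # bits per second

def fsk_modulate(bits):
    """Generate audio samples: tone F0 for a 0 bit, tone F1 for a 1 bit."""
    samples_per_bit = SAMPLE_RATE // BAUD
    out = []
    for bit in bits:
        freq = F1 if bit else F0
        for n in range(samples_per_bit):
            out.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return out

def fsk_demodulate(samples):
    """Recover bits by comparing signal energy at F0 vs F1 per bit slot."""
    samples_per_bit = SAMPLE_RATE // BAUD
    bits = []
    for i in range(0, len(samples) - samples_per_bit + 1, samples_per_bit):
        chunk = samples[i:i + samples_per_bit]

        def tone_energy(freq):
            # Correlate the chunk against a complex tone at `freq`
            re = sum(s * math.cos(2 * math.pi * freq * n / SAMPLE_RATE)
                     for n, s in enumerate(chunk))
            im = sum(s * math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
                     for n, s in enumerate(chunk))
            return re * re + im * im

        bits.append(1 if tone_energy(F1) > tone_energy(F0) else 0)
    return bits
```

A real acoustic channel would also need synchronization and noise handling, but the round trip `fsk_demodulate(fsk_modulate(bits))` shows the core encoding idea.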
How we built it
We built our prototype around a robot chassis and a sensor system. Using the RedBot Arduino, we created a small base that moves on command, and we designed a sensor configuration that can be mounted on the robot and used, in conjunction with a distant audio source, to determine the position of that source.
Challenges we ran into
We spent a lot of time debugging hardware issues. Integrating sensors and passing data between boards such as the DragonBoard 410C, the Raspberry Pi Zero, and the RedBot Arduino is hard, so we looked for ways to reduce the number of controllers we needed and learned about alternative ways to build the robot system.
Accomplishments that we're proud of
We're proud of what we learned and built during the 24 hours of HackTX, from signal-processing strategies to how different controllers implement serial communication. Our physical end result shows that we worked on something substantial, and that's what counts to us.
What we learned
We learned that things don't go the way they're supposed to. Sensors won't always output clean results, and sometimes a board won't run code correctly. But we also learned how to superpose signals for more accurate tracking, and we gained a better understanding of how Python-Arduino integration works.
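A common pattern for Python-Arduino integration is to send small text commands over a serial link that an Arduino sketch parses. Here is a minimal sketch of that idea; the `D,<left>,<right>` wire format and the port name are illustrative assumptions, not PhaseBot's actual protocol.

```python
def make_drive_command(left_speed, right_speed):
    """Format a motor command for an Arduino sketch to parse over serial.

    Speeds are clamped to the -255..255 PWM range the RedBot's motor
    driver expects. The 'D,<left>,<right>\n' format is our own
    illustrative convention.
    """
    left = max(-255, min(255, int(left_speed)))
    right = max(-255, min(255, int(right_speed)))
    return "D,{},{}\n".format(left, right)

# With the third-party pyserial package installed, the command
# would be written to the board roughly like this (port name is
# an assumption and varies by machine):
#
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
#       port.write(make_drive_command(120, 120).encode("ascii"))
```

Keeping the protocol as plain newline-terminated text makes it easy to debug from any serial monitor before wiring up the Python side.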
What's next for PhaseBot
PhaseBot will be built upon and optimized for real-world testing. Eventually, PhaseBot will be able to navigate around large obstacles and move with precision.
