Inspiration
Humans rely almost entirely on vision to navigate the world, but vision isn’t always reliable. Darkness, fog, crowded spaces, and other low‑visibility environments can instantly reduce spatial awareness. With over 2.2 billion people worldwide experiencing some form of vision impairment, we were inspired to explore how expanding human perception could increase safety, independence, and confidence. EchoSense began as a speculative tool imagining what humans could do if we could see with sound.
What it does
EchoSense replicates echolocation, the sensory system used by bats, dolphins, and other animals. The system emits sound waves, detects how they bounce off nearby objects, and translates the returning signals into a visual spatial map. Users gain a new sensory layer that helps them perceive distance, shape, and movement even when vision is limited. The app also includes educational infographics explaining the science behind echolocation and how different animals use it in unique ways.
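The core science behind this mapping is time‑of‑flight ranging: a sound pulse travels out to an object and back, so the object's distance is half the round‑trip time multiplied by the speed of sound. As a minimal conceptual sketch (not EchoSense's actual implementation, which is a design prototype):

```python
# Time-of-flight ranging: the principle echolocation is built on.
# Sound covers the gap to the object twice (out and back), so the
# one-way distance is half the round-trip time times sound speed.

SPEED_OF_SOUND_AIR = 343.0  # metres per second in air at ~20 °C

def distance_from_echo(round_trip_seconds: float,
                       speed_of_sound: float = SPEED_OF_SOUND_AIR) -> float:
    """Estimate distance to a reflecting object from its echo delay."""
    return speed_of_sound * round_trip_seconds / 2

# An echo returning after 10 ms implies an object about 1.7 m away.
print(round(distance_from_echo(0.010), 2))
```

This is the same arithmetic whether the emitter is a bat, a dolphin, or a sensor; only the medium's sound speed changes.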
How we built it
We began by researching how different animals use echolocation, studying the way bats, dolphins, and other species emit sound waves and interpret returning echoes. From there, we explored how this biological process could be translated into a visual interface for humans, mapping sound waves into clear, intuitive spatial cues. We designed UI elements that communicate distance, shape, and object placement without overwhelming the user, focusing on clarity and accessibility. Alongside the interface, we created educational infographics that explain the science behind echolocation and highlight how each animal performs it differently. Finally, we built a narrative imagining how humans could adopt a sixth sense through technology, grounding our speculative concept in real scientific principles.
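One way to picture the "distance into visual cue" mapping described above: render nearer objects as larger, more opaque markers so urgency scales with proximity. The function name, pixel sizes, and range below are illustrative assumptions, not the actual EchoSense design spec:

```python
# Hypothetical mapping rule: close objects get large, nearly opaque
# markers; distant objects fade and shrink. All values are assumed
# for illustration, not taken from the EchoSense design.

def marker_style(distance_m: float, max_range_m: float = 10.0) -> dict:
    """Map a detected distance to simple visual-cue parameters."""
    # Normalise and clamp to [0, 1]: 0 = touching the user,
    # 1 = at the edge of the sensing range.
    t = min(max(distance_m / max_range_m, 0.0), 1.0)
    return {
        "radius_px": round(40 * (1 - t) + 8 * t),  # 40 px near, 8 px far
        "opacity": round(1.0 - 0.8 * t, 2),        # fades toward 0.2
    }

print(marker_style(1.0))  # close object: large, nearly opaque
print(marker_style(9.0))  # distant object: small, faint
```

Keeping the mapping to two or three parameters like this is one way to communicate spatial data without the sensory overload the design tries to avoid.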
Challenges we ran into
Designing visuals that represent sound‑based spatial data without overwhelming users was one of our biggest challenges. We wanted EchoSense to feel intuitive, so we focused heavily on usability. That meant thinking through where the user would experience the least inconvenience, how much information should appear on screen at once, and how to present it without causing sensory overload. We also explored practical safety considerations, such as how many emergency‑contact buttons should be included, where they should be placed, and how easily they could be accessed in stressful situations.
Accomplishments that we're proud of
We’re honestly most proud of the theme we decided to build around: this whole playful idea of giving humans a sixth sense. It made the project feel fun and imaginative, and it let us lean into the connection between humans and animals in a way that feels almost like giving people a tiny superpower. The concept reaches beyond ordinary human sensory experience, which made it way more exciting to design.
We also had a lot of fun researching how animals actually use echolocation. Some of the things we learned genuinely surprised us, and that’s what pushed us to include an educational section in the app. We figured if we were fascinated by it, users would be too. So being able to mix science, creativity, and accessibility into one project is something we’re really proud of.
What we learned
We learned a lot about how different animals use echolocation in their own ways. Bats rely on high‑frequency sounds to navigate and hunt in total darkness, using the echoes to figure out distance, movement, and shape. Shrews, even with their poor eyesight, use ultrasonic calls underground to track insects. Dolphins send sound waves through their melon, and the returning echoes travel through their lower jaw to the inner ear, letting them build a mental image of what’s around them. And toothed whales fire off rapid clicks that bounce through the water, helping them detect an object’s size, distance, and shape with surprising accuracy. Seeing how each species evolved its own version of this ability helped us understand the core science behind EchoSense and inspired how we imagined humans using a similar “sixth sense.”
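One detail that ties these species together: the ranging math is identical, but the medium changes everything. Sound travels roughly 4–5× faster in seawater (~1500 m/s) than in air (343 m/s), so the same echo delay means a very different distance for a dolphin than for a bat. A small illustrative comparison:

```python
# Same echo delay, different medium: a bat in air vs a dolphin in
# seawater. Speeds are standard approximations (343 m/s in air at
# ~20 °C, ~1500 m/s in seawater).

SPEED_AIR = 343.0        # m/s
SPEED_SEAWATER = 1500.0  # m/s

def echo_range(delay_s: float, speed: float) -> float:
    # One-way distance: the sound makes a round trip.
    return speed * delay_s / 2

delay = 0.020  # a 20 ms round trip
print(f"bat:     {echo_range(delay, SPEED_AIR):.2f} m")
print(f"dolphin: {echo_range(delay, SPEED_SEAWATER):.2f} m")
```

For that 20 ms echo, the bat's target is about 3.4 m away while the dolphin's is about 15 m away, which is part of why each species evolved its own calibration of the same underlying sense.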
What's next for EchoSense
Two features we think would be great additions to EchoSense are environmental profiles and path guidance for low‑visibility situations. Environmental profiles would let the app automatically adjust its sensitivity based on where the user is, such as night walking, fog, crowded streets, or indoor spaces, so the experience always feels comfortable rather than overwhelming. Path guidance would take things a step further by not only detecting obstacles but also gently showing users the safest route forward using subtle arrows or pulses. Together, these features would make EchoSense feel even more like a real sixth sense, helping people move through the world with more confidence and ease.
Built With
- canvas
- figma
- figmamake
- figmaslides