Inspiration

Navigating city transit often feels like a guessing game. Between "ghost buses," outdated service alerts, and the constant friction of the TTC’s unpredictable scheduling, we realized that commuters need more than just a map; they need clarity and a reason to stay engaged. We wanted to turn the frustration of waiting into a rewarding, data-driven experience.

What it does

Transit Lens is a real-time navigation and prediction platform tailored for the TTC (starting with the 501 Queen and 503 King streetcars). It provides:

Live Outage Updates: Harnessing the TTC API and the Gemini API in a RAG pipeline to keep users informed of outages.

Turn-by-turn navigation: Phone's dead? No worries - find a Transit Lens kiosk and you're good to go!

Intelligent Narratives: AI-powered audio and visual updates that describe station conditions and service changes in plain English.
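To illustrate the outage-update idea, here is a minimal sketch of a RAG-style step: retrieved alert text is stuffed into a prompt, then sent to Gemini. The function names, model name, and prompt wording are our illustrative assumptions, not the exact production code.

```python
def build_outage_prompt(alerts: list[str], route: str) -> str:
    """The 'retrieval' half: stuff live alert text into the prompt."""
    context = "\n".join(f"- {a}" for a in alerts)
    return (
        f"You are a TTC transit assistant. Using only the alerts below, "
        f"summarize current outages affecting route {route} in plain English.\n"
        f"Alerts:\n{context}"
    )

def summarize_outages(alerts: list[str], route: str, api_key: str) -> str:
    """The 'generation' half: ask Gemini to narrate the retrieved alerts."""
    import google.generativeai as genai  # pip install google-generativeai
    genai.configure(api_key=api_key)
    model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model choice
    return model.generate_content(build_outage_prompt(alerts, route)).text
```

Grounding the model in retrieved alerts (rather than asking it open-ended questions) keeps the narration tied to what the TTC actually reported.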

How we built it

The build process evolved in stages, scaling up both functionality and accessibility:

Web Foundation: We started by building a robust web application for desktops using React, Next.js, and Python/Flask, focusing on real-time data ingestion from the NextBus XML API.

Mobile Optimization: Once the core logic was stable, we refactored the UI to be fully mobile-responsive, ensuring commuters could access the "Lens" on the go.

Voice Integration: To make the application more accessible, we integrated ElevenLabs to convert live transit data and "Transit Lens" updates into natural-sounding speech, giving the app a "human" guide for commuters.

Hardware Expansion: We ported a version to the Raspberry Pi to show that even small, low-powered hardware can run it with ease, making kiosk deployment straightforward.
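The real-time ingestion step above boils down to polling the NextBus public XML feed and pulling vehicle positions out of the response. A hedged sketch (the feed URL and TTC agency tag are real; error handling and field names beyond `id`/`lat`/`lon` are simplified):

```python
import urllib.request
import xml.etree.ElementTree as ET

FEED = ("http://webservices.nextbus.com/service/publicXMLFeed"
        "?command=vehicleLocations&a=ttc&r={route}&t=0")

def parse_vehicles(xml_text: str) -> list[dict]:
    """Extract id/lat/lon from each <vehicle> element in the feed."""
    root = ET.fromstring(xml_text)
    return [
        {"id": v.get("id"),
         "lat": float(v.get("lat")),
         "lon": float(v.get("lon"))}
        for v in root.iter("vehicle")
    ]

def fetch_vehicles(route: str = "501") -> list[dict]:
    """Poll live positions for one route (e.g. the 501 Queen streetcar)."""
    with urllib.request.urlopen(FEED.format(route=route), timeout=10) as resp:
        return parse_vehicles(resp.read().decode("utf-8"))
```

Polling this feed on a short interval is what lets the map reflect where streetcars actually are, rather than where the schedule says they should be.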

How it works

The map itself is rendered using Leaflet.js, while real-world locations in the search come from Nominatim, OpenStreetMap's search engine. The subway stations are hardcoded, while the streetcar data comes from the NextBus API. When you press "Generate", the app uses Dijkstra's algorithm to find the optimal route on the TTC, and uses the foot-routing profile of OSRM, an open-source turn-by-turn navigation API, to handle the walking sections. Default transfer messages and OSRM instructions are stitched together into a cohesive set of instructions to get you to your destination. The "Relative Directions" option then uses the Gemini API together with the Google Maps API to make the instructions more readable and easier to follow, with landmarks and simple directions leading the way. The user even has the option to get a printed copy of these instructions (we repurposed a receipt printer for this).
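The transit leg of that routing step can be sketched as plain Dijkstra over a weighted stop graph, with edge weights in minutes. The stop names and times below are made up for illustration; the real graph mixes streetcar segments and walking shortcuts.

```python
import heapq

def dijkstra(graph: dict, start: str, goal: str) -> list[str]:
    """Return the minimum-total-time path from start to goal."""
    queue = [(0, start, [start])]  # (cost so far, node, path taken)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nbr, minutes in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(queue, (cost + minutes, nbr, path + [nbr]))
    return []  # no route found

# Toy network: a few 501 Queen stops with invented travel times.
network = {
    "Queen & Spadina": {"Queen & University": 4, "Osgoode": 6},
    "Queen & University": {"Queen & Yonge": 3},
    "Osgoode": {"Queen & Yonge": 2},
    "Queen & Yonge": {},
}
# 4 + 3 = 7 min via University beats 6 + 2 = 8 min via Osgoode.
print(dijkstra(network, "Queen & Spadina", "Queen & Yonge"))
```

In the full app, the nodes on the returned path are then matched up with walking legs from OSRM, and the two instruction streams are stitched together.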

A simpler version of the same app runs on Raspberry Pi systems for use in kiosks.

Brand Identity: To tie the ecosystem together, we developed the Transit Lens brand, including a custom logo and a cohesive visual language designed to look professional and trustworthy.

Physical Output: Finally, we integrated a printer into the Pi setup, writing custom scripts to generate and print navigation plans for "offline" transit assistance.
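A minimal sketch of that print path, using the python-escpos library to drive an ESC/POS receipt printer: the formatting helper is ours, and the USB vendor/product IDs are placeholders for whatever printer is attached, not the actual hardware used.

```python
import textwrap

def format_plan(steps: list[str], width: int = 32) -> str:
    """Number the steps and wrap them to the printer's column width."""
    lines = ["TRANSIT LENS", "-" * width]
    for i, step in enumerate(steps, 1):
        lines.extend(textwrap.wrap(f"{i}. {step}", width))
    return "\n".join(lines) + "\n"

def print_plan(steps: list[str]) -> None:
    """Send a formatted navigation plan to a USB receipt printer."""
    from escpos.printer import Usb  # pip install python-escpos
    printer = Usb(0x04b8, 0x0202)  # placeholder vendor/product IDs
    printer.text(format_plan(steps))
    printer.cut()
```

Keeping the formatting separate from the hardware call also made it easy to preview plans on screen before committing paper to them.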

Challenges we ran into

Getting the printer to work was a massive headache - it would only connect to a laptop under perfect conditions, and even then it wouldn't work consistently.

Accomplishments that we're proud of

We are incredibly proud of the end-to-end evolution of the project. We didn't just stop at a digital dashboard. We pushed the boundaries of what a hackathon project can be by achieving several major milestones:

Platform Versatility: We successfully transitioned from a desktop-first web app to a mobile-optimized experience, and finally to a functional Raspberry Pi hardware node.

The Physical Bridge: Designing and troubleshooting a custom printing system that generates physical navigation assets was a massive win. Overcoming the buffer memory issues to get physical instructions into a user’s hand felt like a true "engineering" moment.

Multimodal Accessibility: Integrating ElevenLabs to provide high-fidelity audio narration changed the project from a simple map into an inclusive tool for the visually impaired or those who need eyes-free navigation.

Brand Cohesion: We are proud of the Transit Lens identity. Creating a professional logo and a unified visual language made the project feel less like a prototype and more like a real-world startup.

What we learned

We learned a ton about the nuances of transit APIs and the complexities of real-time state management. Beyond the code, we realized how much "dead time" exists in a daily commute and how much potential there is to gamify public infrastructure data to make it more transparent.

What's next for Transit Lens

Our goal is to one day partner with the TTC and bring a Transit Lens kiosk to a location near you.
