Inspiration

We've all been there: sitting in a Ticketmaster queue, heart racing, watching the progress bar crawl, only to get hit with a blurry CAPTCHA asking you to identify traffic lights while thousands of bots silently cut the line ahead of you. It's frustrating, it's unfair, and it's broken.

As students at Howard University and real music fans, we asked ourselves: what if the queue itself was the security? What if, instead of punishing real fans with friction, we turned the waiting room into something fans actually enjoy, while AI quietly separates humans from bots behind the scenes? That question became Ticket Leader.

We were also inspired by a gap in the ticketing experience that nobody talks about: fans buy tickets to venues they've never visited and have no idea what to expect when they arrive. Where do I park? How far is my seat? Is it accessible? We wanted to solve the full fan journey, not just the queue. That's where our 3D stadium walkthrough was born, powered by ElevenLabs voice guidance so every fan, regardless of ability, can virtually walk from the parking lot to their exact seat before they even leave home.

The hackathon theme, Leveraging AI for Truth & Service, locked it all together. Truth means a queue that's genuinely fair and transparent. Service means building for every fan, not just the tech-savvy ones with fast internet.
What It Does

Ticket Leader is a gamified fan verification system that replaces traditional CAPTCHAs with artist-themed interactive experiences during high-demand ticket onsales.

The Queue Experience

When fans enter the ticket queue, they don't stare at a loading bar; they enter a pre-show experience:
- Artist Trivia: dynamically generated by Google Gemini so questions are always fresh and impossible for bots to pre-scrape
- Live Polls: "Which song should open the show?" with real-time results from fans across the queue
- Beat Sync: tap along to a clip of the artist's music while AI analyzes whether your rhythm shows human-like variance
- Emoji Vibe: pick emojis that match a song's mood, validated by Gemini for semantic coherence
- Hot Takes: type a short opinion about the artist, analyzed by AI for fan authenticity
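The Beat Sync idea of "human-like variance" can be pictured as a simple spread test on tap offsets. A minimal sketch, with illustrative thresholds rather than our calibrated values, and a hypothetical `tapTimingScore` helper:

```javascript
// Hypothetical sketch: score tap timing for human-like variance.
// Humans drift roughly 50-150 ms around the beat; bots land within a few ms.
function tapTimingScore(tapTimesMs, beatTimesMs) {
  // Offset of each tap from its nearest expected beat
  const offsets = tapTimesMs.map((t) => {
    const nearest = beatTimesMs.reduce((a, b) =>
      Math.abs(b - t) < Math.abs(a - t) ? b : a
    );
    return t - nearest;
  });
  const mean = offsets.reduce((s, x) => s + x, 0) / offsets.length;
  const variance =
    offsets.reduce((s, x) => s + (x - mean) ** 2, 0) / offsets.length;
  const stdDev = Math.sqrt(variance);
  // Sub-10 ms precision is suspiciously mechanical; moderate drift reads as human
  if (stdDev < 10) return { verdict: "bot-like", stdDev };
  if (stdDev <= 150) return { verdict: "human-like", stdDev };
  return { verdict: "inconclusive", stdDev };
}
```

A verdict like this would be just one signal feeding the trust engine, not a pass/fail gate on its own.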
While fans play, an invisible AI trust engine scores their behavior: mouse movement patterns, scroll behavior, tap timing, and interaction depth build a real-time trust score. Most real fans pass without ever seeing a challenge; suspicious sessions get escalated to harder step-up verification.

Live Fan Heatmap

A real-time animated map shows where fans are joining from across the country. Pulsing dots grow as more fans join from each city, with live stats like "14,203 fans from 48 states." It builds community and excitement while providing geographic plausibility signals to the AI trust engine.

3D Stadium Walkthrough

After purchasing tickets, fans unlock a 3D walkthrough of the venue, from the parking lot, through the gates, to their exact seat. Powered by ElevenLabs voice narration, the walkthrough provides turn-by-turn audio directions. This is especially powerful for fans with visual impairments, mobility considerations, or anyone visiting an unfamiliar venue for the first time.

Accessibility-First Design

Every mini-game has multiple modalities: visual, audio, and touch-based alternatives. Screen reader support, keyboard navigation, and no-audio fallbacks are built in from day one. Fans who choose not to play games are never penalized. The 3D walkthrough with voice guidance ensures venue accessibility information reaches the fans who need it most.
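The trust engine's pass-or-escalate decision boils down to blending normalized signal scores into one number. A minimal sketch, with invented weights and threshold (the real system tunes these against live data):

```javascript
// Illustrative only: weights and threshold are invented for this sketch.
// Each signal is assumed pre-normalized to the 0-1 range.
const WEIGHTS = { behavior: 0.4, gameplay: 0.3, network: 0.15, geo: 0.15 };
const PASS_THRESHOLD = 70; // below this, escalate to a step-up challenge

function trustScore(signals) {
  // Weighted sum of the available signals, scaled to 0-100
  const raw = Object.entries(WEIGHTS).reduce(
    (sum, [key, w]) => sum + w * (signals[key] ?? 0),
    0
  );
  return Math.round(raw * 100);
}

function decide(signals) {
  const score = trustScore(signals);
  return { score, action: score >= PASS_THRESHOLD ? "pass" : "step-up" };
}
```

A weighted blend like this degrades gracefully: one weak signal (say, an unusual network) lowers the score without automatically failing a fan who behaves like a human everywhere else.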
How We Built It

Frontend

We built the fan-facing experience with React and Vite, creating a fast, modular single-page application. The queue room, mini-game engine, fan heatmap, and 3D walkthrough are all component-based. We used CSS animations for the pulsing heatmap dots, game transitions, and trust tier visual upgrades. The fan heatmap renders an interactive US map with real-time dot plotting, and the Beat Sync game leverages the Web Audio API for precise audio timing analysis.

Backend

Our server runs on Node.js with Express and uses MongoDB for persistent data: user accounts, event configurations, artist content, and session management. API endpoints handle authentication, game content delivery, trust score computation, and real-time queue state. We containerized the full application with Docker for consistent deployment.

AI — Gemini Integration

The Google Gemini API is the AI backbone powering four key features:
- Dynamic trivia generation: fresh, artist-specific questions generated per session so bots can't pre-scrape answers
- Emoji Vibe validation: checking that a fan's emoji picks are semantically coherent with the song's mood
- Hot Takes analysis: evaluating short fan-written opinions for authenticity
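Model output isn't guaranteed to be clean JSON, so the integration parses defensively and retries. A hypothetical sketch, where `callModel` stands in for the actual Gemini SDK call:

```javascript
// Hypothetical helper: the model sometimes wraps JSON in markdown fences or
// adds surrounding prose, so extract the JSON payload defensively.
function extractJson(raw) {
  // Strip any ``` or ```json fences the model may have added
  const unfenced = raw.replace(/```(?:json)?/gi, "").trim();
  const start = unfenced.search(/[\[{]/);
  const end = Math.max(unfenced.lastIndexOf("]"), unfenced.lastIndexOf("}"));
  if (start === -1 || end === -1) throw new Error("no JSON in model output");
  return JSON.parse(unfenced.slice(start, end + 1));
}

// Retry wrapper: re-ask the model when its output fails to parse.
// `callModel` is a placeholder for the real Gemini API call.
async function generateTrivia(callModel, maxAttempts = 3) {
  let lastErr;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return extractJson(await callModel());
    } catch (err) {
      lastErr = err; // malformed output: try again
    }
  }
  throw lastErr;
}
```

In practice a schema check on the parsed questions (fields present, one correct answer, sane difficulty) would sit between parsing and serving.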
AI — Trust Engine

The behavioral trust scoring system analyzes signals collected passively from each session: mouse movement entropy, scroll velocity patterns, click position variance, tap timing distributions, and keystroke flight times. These features feed into a classification model that distinguishes human interaction patterns from bot-like mechanical precision. The trust engine aggregates behavioral, game performance, network, and geographic signals into a weighted 0-100 score that determines whether a fan passes automatically or faces a step-up challenge.

3D Walkthrough + Voice

The stadium walkthrough gives fans a complete virtual navigation experience from the parking lot to their assigned seat. The ElevenLabs API generates natural-sounding voice narration for turn-by-turn directions, making the experience fully accessible to visually impaired fans and helpful to anyone visiting an unfamiliar venue.

Geolocation

IP-based geolocation via ip-api.com (city-level only, privacy-preserving) powers the live fan heatmap and provides geographic plausibility signals to the trust engine, checking whether a user's IP location, browser timezone, and device language are consistent with each other and with the venue location.
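The geographic consistency check can be sketched as a small soft-scoring function. A hypothetical sketch (field names and deductions are invented for illustration, not the real model):

```javascript
// Hypothetical plausibility check: do IP geolocation, browser timezone, and
// device language tell a consistent story? Mismatches lower trust softly
// rather than blocking outright, since fans travel and use VPNs.
function geoPlausibility({ ipCountry, ipTimezone, browserTimezone, language }) {
  let score = 1.0;
  // Strong signal: IP says one timezone, the browser clock says another
  if (ipTimezone !== browserTimezone) score -= 0.4;
  // Weak signal: many fans browse in English from anywhere
  const langRegion = language.split("-")[1]; // "en-US" -> "US"
  if (langRegion && langRegion !== ipCountry) score -= 0.2;
  return Math.max(0, score);
}
```

The resulting 0-1 value would then feed into the trust engine's weighted aggregate alongside the behavioral and gameplay signals.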
Challenges We Faced

Balancing security and fun. The hardest design problem was making verification feel like entertainment without compromising its effectiveness as a bot detection signal. We went through several iterations of the mini-games before finding the right balance: challenges that are trivially easy for a real fan but statistically hard for bots to fake at scale.

Dynamic content generation with Gemini. Getting Gemini to consistently return well-formatted JSON for trivia questions required careful prompt engineering. We had to handle edge cases where the model would occasionally wrap responses in markdown code blocks or return inconsistent difficulty distributions. Building robust parsing and retry logic was essential.

Beat Sync timing precision. The Web Audio API has inherent latency that varies by device and browser. Getting tap-timing analysis accurate enough to distinguish human rhythmic variance (~50-150 ms) from bot precision (<10 ms) required careful calibration using Dynamic Time Warping rather than simple beat-matching.

Real-time heatmap performance. Rendering 50+ animated, pulsing dots on an SVG map while simultaneously processing game logic and streaming live stats pushed us to optimize aggressively: debouncing state updates, preferring CSS animations over JavaScript-driven ones, and batching re-renders.

Making accessibility genuine, not performative. We committed early to accessibility as a core feature, not an afterthought. This meant rethinking every game interaction: what does Beat Sync look like for a deaf fan? (Visual pulse mode.) What does trivia look like with a screen reader? (Full ARIA labels and keyboard navigation.) The 3D walkthrough voice guidance went through multiple iterations with ElevenLabs to get natural, helpful narration that actually serves fans with disabilities.

MongoDB session management at scale. Managing ephemeral session data alongside persistent user accounts in MongoDB required careful schema design. Trust scores need to be computed and updated in real time but also cleaned up after transactions to respect our data minimization commitment.
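The Dynamic Time Warping approach mentioned above tolerates a consistently early or late (but human) tapper where naive beat-matching would not. A minimal DTW sketch, illustrative rather than our production calibration:

```javascript
// Minimal dynamic-time-warping sketch: align a tap sequence against the
// expected beat grid, so small tempo drift doesn't wreck the score.
function dtwDistance(taps, beats) {
  const n = taps.length;
  const m = beats.length;
  const INF = Number.POSITIVE_INFINITY;
  // dp[i][j] = cost of best alignment of taps[0..i) with beats[0..j)
  const dp = Array.from({ length: n + 1 }, () => new Array(m + 1).fill(INF));
  dp[0][0] = 0;
  for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= m; j++) {
      const cost = Math.abs(taps[i - 1] - beats[j - 1]);
      // Extend the cheapest of: skip a beat, skip a tap, or match both
      dp[i][j] = cost + Math.min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1]);
    }
  }
  // Normalize by path length so longer clips aren't penalized
  return dp[n][m] / (n + m);
}
```

The device-specific Web Audio latency offset would be subtracted from the tap timestamps before running an alignment like this.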
What We Learned
- Security and UX aren't opposed: with the right design, verification can actually make the experience better, not worse.
- AI integration should be intentional: every AI feature in Ticket Leader serves a specific purpose. Gemini for dynamic content generation, behavioral analysis for bot classification, ElevenLabs for accessibility. We avoided using AI for the sake of using AI.
- Accessibility is a feature, not a checkbox: building for all fans from day one made the product stronger for everyone, not just those who need accommodations.
- The best CAPTCHA is one you don't notice: most legitimate fans in our system never see a challenge at all. That's the goal.
- Collaboration under pressure: coordinating a five-person team across frontend, backend, AI, and design in 24 hours taught us more about agile development than any classroom could.
What's Next
- Production ML model trained on real behavioral data from A/B testing against traditional CAPTCHAs
- Cross-event fan reputation so verified fans get progressively faster queues over time
- Expanded 3D walkthroughs for more venues, with full indoor navigation and real-time crowd density
- Artist partnership content: exclusive clips, unreleased snippets, and behind-the-scenes content as queue engagement rewards
- Internationalization: adapting mini-games, accessibility features, and voice guidance for global markets and languages
Built With
- api
- audio
- css3
- docker
- elevenlabs
- express.js
- gemini
- ip-api.com
- javascript
- mongodb
- node.js
- react
- vite
- web