Inspiration
It started, as all great catastrophes do, with a single cursed thought at 2am: what if Among Us but the AI is the crewmate?
We were deep in our third energy drink, staring at a whiteboard that simply said "reverse turing test ??" and honestly that was enough. We wanted to flip the script on the classic "can AI pass as human" question and ask something way more existentially terrifying: can YOU pass as a machine? Not a cool robot. Not a sci-fi android. A boring, rhythmically consistent, emotionless algorithm — and could you survive being audited by one?
Spoiler: you probably can't. Humans are messy, hesitant, chaotic little creatures. And we built a whole game to prove it. April Fools' on you, biology.
What it does
Us Among AI drops you into a retro-futuristic server room where your only job is to not act human. You complete machine-like tasks — identifying audio frequencies with your ears like some kind of organic microphone, solving Tower of Hanoi at a metronomic robot pace, and typing text backwards without flinching, pausing, or (god forbid) making a typo.
But here's the twist nobody asked for but everyone deserved: the game doesn't just care if you finish the tasks. It watches how you do them. Every hesitation, every rhythm hiccup, every time you press backspace out of shame — the Auditor sees it. An invisible, all-knowing, deeply judgmental system analyzes your keypress timing, movement smoothness, correction behaviour, and general human-ness in real time. Then it decides if you're a machine or a liability.
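To give a flavour of what "watching how you do them" means in practice, here's a minimal sketch of the kind of raw metrics the Auditor tracks. The names here (`KeyEvent`, `AuditMetrics`, `collectMetrics`) are illustrative, not our actual API:

```typescript
// Hypothetical sketch of the raw metrics behind the Auditor's judgment.
// Names are illustrative, not our shipped code.

interface KeyEvent {
  key: string;
  timestamp: number; // ms since task start
}

interface AuditMetrics {
  meanInterval: number;     // average ms between keypresses
  intervalVariance: number; // rhythm unevenness — the human giveaway
  backspaceCount: number;   // corrections made out of shame
}

function collectMetrics(events: KeyEvent[]): AuditMetrics {
  const intervals: number[] = [];
  for (let i = 1; i < events.length; i++) {
    intervals.push(events[i].timestamp - events[i - 1].timestamp);
  }
  const n = Math.max(intervals.length, 1);
  const mean = intervals.reduce((a, b) => a + b, 0) / n;
  const variance = intervals.reduce((a, b) => a + (b - mean) ** 2, 0) / n;
  const backspaceCount = events.filter((e) => e.key === "Backspace").length;
  return { meanInterval: mean, intervalVariance: variance, backspaceCount };
}
```

A perfectly machine-like run has near-zero interval variance and zero backspaces. You will not have a perfectly machine-like run.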
We also wired in the Gemini API as part of the audit backend — your actual keystroke telemetry gets buffered and sent to an evaluator that returns a humanness score and cold, clinical reasoning. It's like getting roasted by an AI, except the AI is right and you are in fact very human and very guilty.
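Here's a sketch of the pipeline around that Gemini call — buffering telemetry into a prompt and parsing the verdict back out. The actual call goes through @google/generative-ai's `getGenerativeModel({ model: "gemini-1.5-flash" }).generateContent()`; it's elided here, and the helper names (`buildAuditPrompt`, `parseVerdict`) are illustrative:

```typescript
// Sketch of the audit pipeline around the Gemini call. The network call
// itself is elided; helper names are illustrative.

interface KeystrokeSample {
  key: string;
  deltaMs: number; // time since previous keypress
}

interface Verdict {
  humanness: number; // 0 = machine, 1 = painfully human
  reasoning: string;
}

function buildAuditPrompt(samples: KeystrokeSample[]): string {
  const telemetry = samples.map((s) => `${s.key}:${s.deltaMs}ms`).join(" ");
  return [
    "You are the Auditor. Given keystroke timing telemetry, rate how human",
    'the player is. Reply with JSON: {"humanness": 0..1, "reasoning": "..."}',
    `Telemetry: ${telemetry}`,
  ].join("\n");
}

function parseVerdict(modelText: string): Verdict {
  // Models sometimes wrap JSON in prose or fences, so extract the first
  // {...} span before parsing.
  const start = modelText.indexOf("{");
  const end = modelText.lastIndexOf("}");
  const parsed = JSON.parse(modelText.slice(start, end + 1));
  return { humanness: parsed.humanness, reasoning: parsed.reasoning };
}
```

The `humanness` score then feeds straight into the live suspicion meter, so Gemini's cold, clinical reasoning has actual gameplay consequences.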
How we built it
The frontend was built with Next.js, React, TypeScript, Zustand, and Tailwind CSS, with v0 for rapid component iteration. Zustand handled global game state, task sequencing, and the live audit metric pipeline — because nothing says "you are being surveilled" like a state manager watching your every move.
Custom hooks and event listeners tracked typing patterns, reaction timing, and movement behaviour in real time. It was fine. We're fine.
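In spirit, the store looked something like this — a dependency-free sketch of the shape our Zustand store followed (the real thing uses zustand's `create`; the names here are illustrative):

```typescript
// Dependency-free sketch of the store shape we used for game state plus the
// live audit pipeline. The real store uses zustand's create(); names here
// (createGameStore, reportSuspicion) are illustrative.

function createGameStore() {
  const state = {
    taskIndex: 0,
    suspicion: 0, // 0..1, fed by the audit pipeline
  };
  return {
    getState: () => ({ ...state }),
    advanceTask: () => {
      state.taskIndex += 1;
    },
    reportSuspicion: (delta: number) => {
      // Clamp so one catastrophic hesitation can't overflow the meter.
      state.suspicion = Math.min(1, Math.max(0, state.suspicion + delta));
    },
  };
}
```

Every event listener reported into `reportSuspicion`-style actions, which is how the state manager ended up watching your every move.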
On the backend, Express and Socket.IO received keystroke streams, managed player state, and triggered suspicion evaluation after gameplay events. We integrated the Gemini API via @google/generative-ai using gemini-1.5-flash, building a custom AIEvaluator that takes raw keystroke timing patterns, returns a humanness score with reasoning, and makes Gemini an actual load-bearing part of the gameplay loop rather than a sticker slapped on at the end for prize eligibility.
Audio tasks ran through the Web Audio API directly in the browser — no libraries, just vibes and oscillators. The visual design leaned hard into a glowing, scanline-drenched, retro-futuristic server room aesthetic that we are unreasonably proud of.
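The "just vibes and oscillators" part really is that small. Below is a browser-only sketch: a bare `OscillatorNode` playing the target tone, plus a hypothetical pure helper (`makeOptions`, not our exact task code) that generates the frequency choices the player has to tell apart:

```typescript
// Browser-only sketch of the audio frequency task: no libraries, just a raw
// OscillatorNode. makeOptions is a hypothetical helper for generating the
// answer choices, not our shipped task code.

function playTone(ctx: AudioContext, frequency: number, durationSec: number): void {
  const osc = ctx.createOscillator();
  osc.type = "sine";
  osc.frequency.value = frequency; // Hz
  osc.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + durationSec);
}

// Pure helper: the target frequency plus evenly spaced distractors.
function makeOptions(target: number, count: number, stepHz: number): number[] {
  const options = [target];
  for (let i = 1; i < count; i++) {
    options.push(target + i * stepHz);
  }
  return options;
}
```

Spacing the distractors (the `stepHz` knob) is the whole difficulty curve: 50 Hz apart is merciful, 5 Hz apart is cruelty.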
Challenges we ran into
Making the behaviour analysis feel real instead of random was genuinely hard. Tracking inputs was easy. Deciding what "too human" looks like at a threshold level was an absolute nightmare that required more tuning than any of us expected at hour 9 of a 12-hour hackathon.
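The kind of thing we tuned for hours looks roughly like this — a coefficient-of-variation check on keypress rhythm. The cutoff values below are placeholders, not our shipped numbers:

```typescript
// The sort of "too human" thresholding we spent hour 9 tuning. Cutoffs are
// placeholder values, not our shipped numbers.

function isTooHuman(intervalsMs: number[], backspaces: number): boolean {
  if (intervalsMs.length === 0) return false;
  const mean = intervalsMs.reduce((a, b) => a + b, 0) / intervalsMs.length;
  const variance =
    intervalsMs.reduce((a, b) => a + (b - mean) ** 2, 0) / intervalsMs.length;
  // Coefficient of variation: machines keep this near zero; humans don't.
  const cv = Math.sqrt(variance) / mean;
  const RHYTHM_CUTOFF = 0.25; // placeholder threshold
  const BACKSPACE_CUTOFF = 2; // placeholder threshold
  return cv > RHYTHM_CUTOFF || backspaces > BACKSPACE_CUTOFF;
}
```

Set the cutoffs too low and everyone fails instantly; too high and the Auditor stops feeling omniscient. Finding the band where players fail for reasons they can *feel* was the actual hard part.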
Spatial interactions across different screen sizes also came for us. Because the game is browser-based and coordinate-dependent, hitboxes and render scaling had to be handled carefully or the whole thing fell apart. There was a period where clicking a plinth teleported your cursor to a completely unrelated part of the screen. We do not speak of this period.
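The fix, roughly, was to normalize pointer coordinates into a fixed virtual resolution so hitboxes mean the same thing on every screen. A sketch of the idea (the 1920×1080 virtual space and these helper names are illustrative):

```typescript
// Sketch of the hitbox fix: map client-space clicks into a fixed virtual
// resolution so hitboxes are screen-size independent. The 1920x1080 virtual
// space and helper names are illustrative.

const VIRTUAL_WIDTH = 1920;
const VIRTUAL_HEIGHT = 1080;

interface Rect { left: number; top: number; width: number; height: number }

function toVirtual(clientX: number, clientY: number, canvas: Rect): { x: number; y: number } {
  return {
    x: ((clientX - canvas.left) / canvas.width) * VIRTUAL_WIDTH,
    y: ((clientY - canvas.top) / canvas.height) * VIRTUAL_HEIGHT,
  };
}

function hitTest(p: { x: number; y: number }, box: Rect): boolean {
  return (
    p.x >= box.left && p.x <= box.left + box.width &&
    p.y >= box.top && p.y <= box.top + box.height
  );
}
```

Once every click and every plinth lives in the same virtual space, the cursor-teleportation era ends.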
Accomplishments that we're proud of
The game actually feels like a game. Not a demo. Not a tech prototype with vibes pasted on top. A real game with tension, atmosphere, and the creeping dread of being watched by something that does not care about your feelings.
The Auditor system is the crown jewel — the fact that players feel observed and judged on behaviours they'd never normally think about, like whether they hesitated too long before pressing Enter or whether their rhythm was uneven, is exactly the psychological experience we wanted to create.
We're also proud of how Gemini is integrated. It's not decorative. It receives real telemetry, it returns real judgments, and those judgments feed directly into the live suspicion system. That's a backend gameplay service, not a checkbox.
And the UI? Glowing plinths, scanlines, immersive retro-futuristic server room, the whole bit. We cooked.
What we learned
Managing fast-changing global state with Zustand at game-loop speed is both extremely powerful and a great way to develop new stress responses. We also learned that tracking player behaviour at a granular level is a completely different engineering problem than building a typical UI app — event buffering, timing deltas, and telemetry pipelines are their own whole world.
The biggest lesson though: atmosphere is half the mechanic. A clever idea becomes unforgettable when the interface, the sound, and the interaction design all say the same thing. The game isn't just about the tasks. It's about feeling like you're being audited. We spent real time making that feeling land, and it did.
Also: sleep. Sleep is important. We did not sleep.
What's next for Us Among AI
Global leaderboards. More audit rooms. Harder behaviour tests. Mouse-path analysis because apparently we hate ourselves and want to track everything. A multiplayer mode where one player becomes the AI Auditor and judges another player trying to pass as a machine — which is frankly the most chaotic PvP concept we've ever considered and we love it.
On the Gemini side, we want richer auditor commentary, adaptive judgment that evolves across runs, and personalized feedback that makes each audit feel unique. The goal is an Auditor that doesn't just score you — it knows you. And that, honestly, is a little scary.
Which means we're definitely building it.
Built With
TypeScript, JavaScript, Next.js, React, Node.js, Express, Socket.IO, Zustand, Tailwind CSS, Radix UI, Web Audio API, Gemini API, Google Generative AI SDK, v0, Vercel
