Inspiration
Gen Z and Gen Alpha have built their own digital dialect, a living language shaped by memes, trends, and culture. But with every new term like “skibidi toilet” or “tungtungtung sahur”, understanding each other becomes harder and harder, especially across generations and subcultures.
Our Solution
To address this growing communication gap, we built a translation messaging forum that bridges generational and cultural slang divides! Our platform lets users communicate naturally by typing in their own style, slang, or dialect while their messages are automatically translated into language the other person understands. Rather than translating word-for-word, our program interprets the writing and communication style you're using based on customization settings (e.g., region, age). By applying context-aware translations, our system improves message clarity across generations. As a result, it promotes inclusive digital communication, enabling smoother collaboration in education, workplaces, and online communities and opening opportunities for connection, accessibility, and understanding.
How we built it
Architecture & API: Two-tier stack with a Node.js/Express backend and a React Vite front end. The API exposes /api/translate, /api/users/upsert, /api/messages, plus /api/getthreads and /api/createcomments for the forum, all validated with Zod and configured via .env/CORS. Authentication runs through Auth0; JWTs are verified server-side before upserting Mongo profiles (uid, displayName, generation, regionPref).
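As a rough sketch of how one of these validated, authenticated routes fits together (assuming express, zod, and the express-oauth2-jwt-bearer middleware; the profile fields mirror those named above, but handler details are illustrative):

```typescript
import express from "express";
import { z } from "zod";
import { auth } from "express-oauth2-jwt-bearer";

const app = express();
app.use(express.json());

// Verify Auth0-issued JWTs server-side before any profile writes.
const checkJwt = auth({
  audience: process.env.AUTH0_AUDIENCE,
  issuerBaseURL: process.env.AUTH0_ISSUER_BASE_URL,
});

// Request body validated with Zod, matching the profile fields we store.
const UpsertBody = z.object({
  displayName: z.string().min(1),
  generation: z.string(),
  regionPref: z.string().optional(),
});

app.post("/api/users/upsert", checkJwt, async (req, res) => {
  const parsed = UpsertBody.safeParse(req.body);
  if (!parsed.success) {
    return res.status(400).json({ error: parsed.error.flatten() });
  }
  // The middleware exposes the verified token; its `sub` claim becomes our uid.
  const uid = req.auth?.payload.sub;
  // A real handler would upsert { uid, ...parsed.data } into the Users collection.
  res.json({ uid, ...parsed.data });
});
```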
Detection → Translation Pipeline: Incoming text is normalized (abbreviation + emoji expansion) and passed through a fast rule-based detector against a seeded slang corpus (5k+ entries sourced from Reddit, Twitter, YouTube, Discord). We then call Gemini 2.5-flash with retrieval-augmented hints (top meanings, regions, few-shot examples). Strict JSON parsing, a ~1.6 s timeout, and a deterministic fallback composer keep rewrites fast and reliable even when the model stalls or returns malformed output. Safety scans and an in-memory cache keep latency minimal.
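The timeout-plus-fallback pattern looks roughly like this (a sketch: `callGemini` and `composeFallback` are hypothetical stand-ins for the real Gemini client and rule-based composer):

```typescript
// Hypothetical stand-ins for the real Gemini client and rule-based composer.
async function callGemini(text: string, hints: string[]): Promise<string> {
  return JSON.stringify({ plain: text }); // placeholder
}
function composeFallback(text: string, hints: string[]): { plain: string } {
  return { plain: text }; // deterministic rule-based rewrite in the real pipeline
}

async function translateWithFallback(text: string, hints: string[]) {
  // Reject after ~1.6 s so a slow LLM call never blocks the response.
  const timeout = new Promise<never>((_, reject) =>
    setTimeout(() => reject(new Error("llm-timeout")), 1600)
  );
  try {
    const raw = await Promise.race([callGemini(text, hints), timeout]);
    return JSON.parse(raw); // strict JSON parsing; throws on malformed output
  } catch {
    // Any timeout, API error, or parse failure falls back deterministically.
    return composeFallback(text, hints);
  }
}
```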
Data & Models: MongoDB Atlas + Mongoose with collections for Users, Messages, SlangEntry. A C++ training module ingests multi-source corpora, computes PMI-weighted stats, clustering, and quality scores, and emits updated datasets for promotion. Unique indexes on (phrase, region) prevent duplicates; a seed script bootstraps initial entries (e.g., regex tung…tung…sahur → hello everyone).
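For instance, the unique compound index on the SlangEntry model looks something like this (assuming mongoose; fields beyond phrase and region are illustrative):

```typescript
import mongoose from "mongoose";

const slangEntrySchema = new mongoose.Schema({
  phrase: { type: String, required: true },
  region: { type: String, required: true },
  meanings: [String],
  qualityScore: Number, // produced by the C++ training module
  sources: [String],    // e.g. "reddit", "twitter", "youtube", "discord"
});

// Compound unique index: no duplicate (phrase, region) pairs can be inserted.
slangEntrySchema.index({ phrase: 1, region: 1 }, { unique: true });

export const SlangEntry = mongoose.model("SlangEntry", slangEntrySchema);
```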
LLM Strategy: Retrieval-augmented generation feeds Gemini only the relevant hints; no fine-tuning needed for rapid iteration. Backend schema-checks every response (plain, audienceRewrite, detected, notes, safety) before merging with local rules.
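A sketch of that schema check (assuming zod; the top-level keys mirror those named above, while the inner shapes of `detected` and `safety` are assumptions):

```typescript
import { z } from "zod";

// Top-level keys match the response fields listed above; inner shapes are
// illustrative assumptions.
const LlmResponse = z.object({
  plain: z.string(),
  audienceRewrite: z.string(),
  detected: z.array(z.object({ phrase: z.string(), meaning: z.string() })),
  notes: z.string().optional(),
  safety: z.object({ flagged: z.boolean() }),
});

function parseLlmResponse(raw: string) {
  // Throws if Gemini's JSON drifts from the schema, triggering the fallback path.
  return LlmResponse.parse(JSON.parse(raw));
}
```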
Frontend UX: React/TypeScript SPA with Auth0 login, forums, and instant translation previews; comments update optimistically while reusing cached translations.
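An optimistic update looks roughly like this (a sketch: `postComment` is a hypothetical stand-in for the real /api/createcomments request):

```typescript
import { useState } from "react";

type Comment = { id: string; text: string; translated: string; pending?: boolean };

export function useOptimisticComments(initial: Comment[]) {
  const [comments, setComments] = useState<Comment[]>(initial);

  async function addComment(text: string, cachedTranslation?: string) {
    const optimistic: Comment = {
      id: `tmp-${Date.now()}`,
      text,
      // Reuse a cached translation immediately instead of waiting on the API.
      translated: cachedTranslation ?? text,
      pending: true,
    };
    setComments((prev) => [...prev, optimistic]);
    const saved = await postComment(text);
    // Swap the placeholder for the server-confirmed comment.
    setComments((prev) => prev.map((c) => (c.id === optimistic.id ? saved : c)));
  }

  return { comments, addComment };
}

// Hypothetical stand-in for the real /api/createcomments request.
async function postComment(text: string): Promise<Comment> {
  return { id: `srv-${Date.now()}`, text, translated: text };
}
```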
Tooling & Ops: Nodemon for hot reloads, curl + jq bulk testers, .env configuration, request logging, and strict rate-friendly input validation keep the API stable. An incremental training pipeline (npm run collect:data, slang_trainer) orchestrates ingestion → C++ analytics → deployable artifacts.
Challenges we ran into
One of our biggest challenges was integrating Auth0 authentication with MongoDB. As it was our first time using Auth0, we relied on the built-in connectors and documentation available; however, most were outdated and incompatible with our current stack, making it difficult to directly synchronize authenticated user data with our database. To address this, we developed a workaround that linked authenticated users to database entries through a more flexible mapping method, rather than relying on the legacy automatic integration. This required rethinking how user data was stored and accessed, ensuring that authentication remained secure and consistent across our backend routes.
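In practice, the workaround boils down to keying Mongo user documents on the Auth0 `sub` claim and upserting on every authenticated request, rather than using the legacy connector (a minimal sketch, assuming mongoose; the User model shape follows the profile fields above):

```typescript
import mongoose from "mongoose";

const User = mongoose.model(
  "User",
  new mongoose.Schema({
    uid: { type: String, required: true, unique: true }, // Auth0 `sub` claim
    displayName: String,
    generation: String,
    regionPref: String,
  })
);

// Called after JWT verification: find-or-create the profile for this identity.
async function linkAuthenticatedUser(sub: string, displayName: string) {
  return User.findOneAndUpdate(
    { uid: sub },
    { $set: { displayName } },
    { upsert: true, new: true } // create on first login, update thereafter
  );
}
```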
Accomplishments that we're proud of
We're proud of our sleek, minimalistic frontend design and all the animation features we've added! On the stack side, we used MongoDB, Auth0, and React to keep the platform scalable and robust, and created responsive designs that look polished across all platforms. Moreover, we integrated every feature we originally planned within our timeframe, from the authentication system to the interpretation engine and forum interactions.
What we learned
Building Slanguage taught us a lot about both the technical and human sides of creating a platform. We gained hands-on experience integrating authentication systems like Auth0 with a database, coordinating frontend and backend features, and designing for scalability and maintainability. Balancing complex backend logic with intuitive, user-focused design showed us that creating a platform that is both robust and inclusive requires careful planning, creativity, and iteration.
What's next for Slanguage
In the future, we plan to expand Slanguage with more customization options, allowing users to tailor translations to even more regions, cultures, socio-economic backgrounds, and beyond. We'd also love to add a feature that auto-detects customization options from the user's profile. This system would grow over time, becoming more attuned to each user's writing tics and producing translations that are easier for them to understand.
Built With
- cors
- express.js
- gemini
- geminiapi
- genai
- git
- javascript
- mongodb
- mongoose
- node.js
- react
- tailwind
