Inspiration
"The old systems are cooling, solidifying into obsidian. It's time to inject new heat."
This line from the LAVAPUNK brief struck a chord. Walk through any university campus, and you see the "obsidian age" of administration: fading paper notices taped to walls, chaotic WhatsApp groups filled with spam, and students frantically searching for lost laptops or keys in a void of information.
We didn't want to build just another utility app. We wanted to build a Foundry. We wanted to take the raw, scattered data of lost items and melt it down into a single, high-speed Recovery Protocol. AssetHunter was born from the desire to turn a boring administrative task into a tactical, beast-mode experience.
What it does
AssetHunter is a high-intensity Campus Recovery Protocol that replaces analog chaos with digital aggression.
The Sentinel Scanner: Instead of typing out long descriptions, users snap a photo. Our integrated local AI (Ollama running the Moondream vision model) analyzes the image in seconds, extracting the item type, color, brand, and condition. It’s smart enough to distinguish a pen from a stylus, protecting data integrity.
Tactical Radar Map: We ditched standard, boring maps. On the web, we built a custom "Tactical Grid" that visualizes the density of lost items across campus sectors using a dark-mode, CSS-filtered radar view.
Live Aggregation Feed: A real-time stream of assets, instantly filterable by category (Electronics, IDs, Keys).
Universal Access: It runs everywhere—as a native mobile app for power users and a Progressive Web App (PWA) for instant access via QR code.
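To make the Sentinel Scanner concrete, here is a minimal sketch of how an app like this can talk to a locally hosted Ollama instance. Ollama's `/api/generate` endpoint accepts base64-encoded images alongside a prompt; the prompt wording, helper names, and host handling below are our illustrative assumptions, not AssetHunter's exact code.

```typescript
// Shape of a request to Ollama's /api/generate endpoint.
type ScanRequest = {
  model: string;
  prompt: string;
  images: string[];
  stream: boolean;
};

// Illustrative prompt — the real one also carries the category rules and
// the privacy instruction to never transcribe text on ID cards.
const SCAN_PROMPT =
  "Describe this lost item. Report only: item type, color, brand, condition. " +
  "Do not read or transcribe any text on ID cards or documents.";

function buildScanRequest(imageBase64: string): ScanRequest {
  return {
    model: "moondream",    // small vision model served by Ollama
    prompt: SCAN_PROMPT,
    images: [imageBase64], // Ollama accepts base64-encoded images
    stream: false,         // one JSON response instead of a token stream
  };
}

// Send the photo to the local inference engine — no cloud round-trip.
async function scanItem(host: string, imageBase64: string): Promise<string> {
  const res = await fetch(`http://${host}:11434/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildScanRequest(imageBase64)),
  });
  const data = await res.json();
  return data.response; // Moondream's description of the item
}
```

Because the model runs on a laptop on the same LAN, `host` is the machine's local IP rather than a cloud hostname, which is exactly what makes the pipeline free and private.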
How we built it
We adopted the "Foundry" mindset: forge fast, break things, refine.
The Core: Built with React Native and Expo (SDK 54) for true cross-platform dominance (Android, iOS, Web).
The Brain: We rejected cloud APIs for privacy and speed. We integrated Ollama running the Moondream vision model locally. The app talks directly to the local inference engine over the network, keeping student data off the public cloud.
The Interface: We designed a custom "LavaPunk" design system—Obsidian backgrounds with Magma Orange accents—bypassing standard Material/Cupertino libraries for a unique, hacker-terminal aesthetic.
The Web Map: Since native map modules crash on the web, we engineered a platform-specific fallback. We used an OpenStreetMap embed with heavy CSS filters (invert, hue-rotate) to match our dark theme and overlaid a coordinate-based React Native grid to plot items as "signal blips."
Challenges we ran into
The "Pen is Electronics" Hallucination: Small vision models like Moondream can be stubborn. Early tests kept classifying pens and water bottles as "Electronics." We solved this by implementing a Hybrid Guard: a strict system prompt combined with a runtime TypeScript check that overrides the AI if keywords like "ink" or "stationery" are detected.
The Localhost Trap: Connecting a physical phone to a laptop's local AI server is a nightmare of CORS policies and firewalls. We learned to bind the server to 0.0.0.0 via environment variables in Windows PowerShell (so it listens on the LAN, not just localhost) and to manage the laptop's IP address dynamically within the app.
The Version Wars: React Native Reanimated v4 clashed with our Web build. We had to perform a surgical downgrade to v3.16 and construct a custom Babel configuration to stabilize the "Scanner Line" animations across platforms.
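The Hybrid Guard described above can be sketched as a small runtime check layered on top of the model's answer: trust the AI's category unless its own description contains a contradicting keyword. The "ink"/"stationery" keywords come from the write-up; the rest of the table and the function name are illustrative.

```typescript
// Keyword → category overrides. "ink" and "stationery" are from the
// write-up; the other entries are illustrative additions.
const OVERRIDES: Array<{ keywords: string[]; category: string }> = [
  { keywords: ["ink", "stationery", "ballpoint", "pencil"], category: "Stationery" },
  { keywords: ["water bottle", "flask", "thermos"], category: "Accessories" },
];

// Runtime guard: override a hallucinated label when the model's own
// description contradicts it.
function guardCategory(aiCategory: string, aiDescription: string): string {
  const text = aiDescription.toLowerCase();
  for (const rule of OVERRIDES) {
    if (rule.keywords.some((k) => text.includes(k))) {
      return rule.category; // e.g. "Electronics" → "Stationery" for a pen
    }
  }
  return aiCategory; // no contradiction found; trust the model
}
```

The strict system prompt does most of the work; this check is the cheap deterministic backstop for the cases a small model still gets wrong.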
Accomplishments that we're proud of
Zero-Cost Intelligence: We built a fully functional computer vision pipeline that costs $0 to run. No API keys, no per-token charges. Just raw local compute.
The "Tactical Grid": Making the web map look like a sci-fi radar instead of a generic Google Map embed was a huge aesthetic win that perfectly fits the LAVAPUNK vibe.
Privacy-First Design: By instructing the AI to ignore reading text on ID cards (via prompt engineering), we built safety directly into the ingestion engine.
What we learned
Prompt Engineering is Code: Writing instructions for an LLM is just as critical as writing TypeScript functions. A single word in a prompt can change the entire behavior of the app.
Universal Apps are Hard: "Write once, run everywhere" requires careful handling of platform-specific APIs (like Camera and Maps). Using .web.tsx extensions was a game-changer.
The Power of Edge AI: Running models locally isn't just a gimmick; it offers speed and privacy that cloud services can't match for this use case.
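The `.web.tsx` trick works because Metro tries platform-specific extensions before the bare file. The helper below mimics that resolution order in miniature (file names are illustrative, not the project's actual layout):

```typescript
// Simplified model of Metro's platform extension resolution:
// "<name>.<platform>.tsx" wins over "<name>.tsx" when it exists.
function resolveModule(
  base: string,
  platform: "web" | "ios" | "android",
  files: string[],
): string {
  const candidates = [`${base}.${platform}.tsx`, `${base}.tsx`];
  return candidates.find((c) => files.includes(c)) ?? base;
}
```

So a component importing `./TacticalMap` gets the native map module on Android/iOS and the OpenStreetMap fallback on web, with zero `Platform.OS` branching in the calling code.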
What's next for Asset Hunter
Gamification: Implementing the "Reputation Score" system where users earn "Hunter Badges" for successfully returning items.
Smart Notifications: Using Firebase Cloud Messaging to alert users when an item matching their "Lost" description is scanned by someone else.
Campus Security Integration: An API hook to push "High Value" items (laptops, phones) directly to the campus security dashboard.