Inspiration
Social workers are among the most overworked and underappreciated professionals. They are constantly moving between shelters, housing units, hospitals, and more to help those in need. Yet most of their platforms are still designed for stationary, desktop-based workflows. Workers spend 65% of their time on paperwork while juggling caseloads of 40-60 clients at once.
This is a serious operational problem. A worker may spend hours in their car or in the field, but their most important documentation still has to happen later, often from memory, after an exhausting shift. Critical details are forgotten, notes are delayed, and workers carry the cognitive burden of simply trying to stay compliant.
This becomes even more dangerous when turnover happens. Social work is a high-burnout profession. When a worker leaves, the nuanced understanding they built over months (what interventions worked, what failed, what triggers a crisis, which de-escalation approach is most effective) often disappears with them. The next worker inherits fragments, not a true understanding of the client.
Waypoint was built to solve exactly that. It is designed around the actual field reality of social work: mobile, high-stress, time-constrained, and continuity-sensitive. Instead of forcing workers to adapt to office software, Waypoint adapts to the worker.
What it does
Waypoint is a secure, mobile-first agentic AI case support platform that reduces administrative load while preserving high-quality, legally defensible documentation.
Frictionless Voice Ingestion
After a client visit, a worker can record a voice memo directly from their phone. The audio is transcribed and transformed into a structured case note: the system formats the note into objective language, extracts key entities, applies tags, and organizes it into the client's timeline.
This matters because in the field, the biggest barrier to documentation is not willingness; it is energy and timing. Workers often know what happened, but they do not have the capacity to type long notes after a full day of emotionally difficult visits. Voice input lowers that barrier dramatically.
The note-generation layer is designed around subpoena-safe writing. Instead of vague or emotionally loaded phrasing, it prioritizes:
- Observable events
- Reported statements
- Actions taken
- Follow-up items
- Objective risk indicators
So rather than saying “client was unstable and aggressive,” the system would structure something like:
- Client raised voice during discussion of rent arrears
- Client stated that they had not slept in two nights
- Worker provided housing support referral
- Follow-up visit recommended within 48 hours
That distinction is critical in legal, housing, and health-adjacent contexts.
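As a sketch, the subpoena-safe structure above can be captured in a small schema. The interface and function names here are illustrative, not Waypoint's actual data model, assuming notes are rendered as flat bullet lists:

```typescript
// Hypothetical schema for a structured, subpoena-safe case note.
// Each field holds only objective, verifiable content.
interface CaseNote {
  observedEvents: string[];     // observable events, no inferred emotion
  reportedStatements: string[]; // what the client said, attributed as a report
  actionsTaken: string[];       // concrete worker actions (referrals, calls)
  followUps: string[];          // commitments and recommended next steps
  riskIndicators: string[];     // objective indicators, e.g. "2 nights without sleep"
}

// Render a note as the kind of bullet list shown above.
function renderNote(note: CaseNote): string {
  const lines = [
    ...note.observedEvents,
    ...note.reportedStatements.map(s => `Client stated that ${s}`),
    ...note.actionsTaken,
    ...note.followUps,
    ...note.riskIndicators,
  ];
  return lines.map(l => `- ${l}`).join("\n");
}
```

Keeping reported speech in its own field (and prefixing it with "Client stated that") preserves the distinction between observation and hearsay that matters in court.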
Drive-Time Audio Recaps
Before entering a visit, especially in a high-risk or sensitive situation, a worker can tap once and hear a short spoken recap of the client’s recent history. This recap is generated from the latest notes, summaries, and relevant interventions, then converted into natural-sounding audio through ElevenLabs.
This is a major workflow unlock because social workers often review context while literally parked outside a building or driving between appointments. Reading through multiple notes on a phone is slow and mentally taxing. Listening is much more natural in that environment.
The recap is designed to answer:
- What happened recently?
- Why is this case urgent right now?
- What interventions have already been tried?
- What should the worker keep in mind going into this interaction?
So instead of opening five previous notes and mentally reconstructing the case, the worker gets a focused briefing in under a minute.
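The recap pipeline boils down to one text-to-speech request. As a hedged sketch (the voice id, model id, and helper name are placeholders, and streaming/error handling are omitted), the request to the ElevenLabs text-to-speech endpoint can be built like this:

```typescript
// Build the request for converting a client summary into spoken audio
// via the ElevenLabs text-to-speech REST endpoint. buildRecapRequest is
// a hypothetical helper; the real app streams the response to the PWA.
const ttsUrl = (voiceId: string) =>
  `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`;

function buildRecapRequest(summary: string, voiceId: string, apiKey: string) {
  return {
    url: ttsUrl(voiceId),
    init: {
      method: "POST",
      headers: {
        "xi-api-key": apiKey,           // ElevenLabs API key header
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        text: summary,                  // the Backboard-generated recap text
        model_id: "eleven_turbo_v2",    // assumed low-latency model choice
      }),
    },
  };
}

// Usage (in the backend route):
// const { url, init } = buildRecapRequest(recap, VOICE_ID, process.env.ELEVEN_KEY!);
// const audio = await fetch(url, init); // stream the audio body back to the client
```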
Smart Client Memory
Each client has a persistent memory thread that acts like an evolving case intelligence layer. This is not just a notes database. It is a contextual memory system that allows workers to ask natural language questions and get grounded answers. For example:
- “What housing steps have already been attempted for Alex?”
- “When was the last mental health incident documented?”
- “Has this client missed appointments before?”
- “What follow-up commitments were made last month?”
This is important because traditional case systems are retrieval-heavy and cognition-heavy. The information is technically there, but buried across notes, timestamps, and attachments. Waypoint turns that into actionable recall.
The goal is not to replace professional judgment. The goal is to reduce search time and preserve continuity.
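In Waypoint the recall is handled by Backboard's stateful threads; as a much-simplified illustration of the idea (not the production retrieval), grounded recall over a tagged, timestamped note history might look like:

```typescript
// Simplified stand-in for contextual recall: filter a client's timeline
// by tag and return the most recent matches first. Field names are
// illustrative, not Waypoint's actual schema.
interface TimelineNote {
  date: string;   // ISO date, e.g. "2024-03-02"
  text: string;
  tags: string[]; // e.g. "missed-appointment", "housing"
}

function recall(notes: TimelineNote[], tag: string): TimelineNote[] {
  return notes
    .filter(n => n.tags.includes(tag))
    .sort((a, b) => b.date.localeCompare(a.date)); // newest first
}
```

A question like "Has this client missed appointments before?" then reduces to `recall(notes, "missed-appointment").length > 0`, with the matching notes serving as the grounding for the answer.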
Crisis-Aware Triage
Waypoint continuously surfaces a risk level for each client based on recent documentation patterns. This allows a worker opening the dashboard in the morning to immediately see who may need urgent attention.
The triage logic can account for signals like:
- Missed appointments
- Eviction-risk language
- Recent hospitalization or mental health escalation
- Loss of shelter placement
- Relapse-related indicators
- Unresolved safety concerns
- Gaps in follow-up after prior intervention
This is useful because workers often manage a large caseload where urgency is dynamic. A static list of names is not enough. They need a living sense of which files are heating up.
The triage layer does not make decisions for the worker, but it helps direct attention where it is most needed.
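As a sketch of how such signals could roll up into a surfaced risk level (the weights and thresholds below are invented for illustration and are not Waypoint's production logic):

```typescript
// Signals detected in recent documentation; names mirror the list above.
type Signal =
  | "missed_appointment"
  | "eviction_risk_language"
  | "hospitalization"
  | "lost_shelter_placement"
  | "relapse_indicator"
  | "unresolved_safety_concern"
  | "follow_up_gap";

// Illustrative weights: acute events count more than schedule slips.
const WEIGHTS: Record<Signal, number> = {
  missed_appointment: 1,
  eviction_risk_language: 2,
  hospitalization: 3,
  lost_shelter_placement: 3,
  relapse_indicator: 2,
  unresolved_safety_concern: 3,
  follow_up_gap: 1,
};

// Collapse a client's recent signals into a dashboard-facing risk level.
function riskLevel(signals: Signal[]): "low" | "elevated" | "urgent" {
  const score = signals.reduce((sum, s) => sum + WEIGHTS[s], 0);
  if (score >= 5) return "urgent";
  if (score >= 2) return "elevated";
  return "low";
}
```

The point of keeping this a simple, inspectable score is that the worker can always see *why* a file is flagged, which keeps the triage layer advisory rather than decisional.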
How we built it
Waypoint is built as a mobile-first Progressive Web App (PWA) using Next.js, React, and Tailwind CSS, explicitly designed for the "Car Dashboard" reality of a social worker's daily routine. The architecture relies on four core pillars:
- Security & Identity (Auth0): Because municipal case notes contain highly sensitive PHI (Protected Health Information), we integrated Auth0 Universal Login. To balance HIPAA/PIPEDA compliance with field convenience, we implemented Passwordless Email OTP.
- Agentic Memory & RAG (Backboard.io): Standard LLMs are stateless, which fails when trying to solve the "hand-off problem" for cases spanning years. We used Backboard as our agent runtime and memory OS. Every client gets a dedicated stateful thread. We utilized Backboard’s separated memory streams to isolate verifiable facts from narrative summaries, ensuring pristine context retrieval.
- Reasoning & Structuring (Google Gemini): Routed through Backboard, we use Gemini Pro/Flash to process voice transcripts. We heavily prompt-engineered the model to act as a clinical social worker, strictly extracting objective facts, direct quotes, and next steps to generate "subpoena-safe" documentation without emotional hallucinations.
- Audio Synthesis (ElevenLabs): We integrated the ElevenLabs API to power our "Drive-Time Audio Recaps." The backend fetches a client's Backboard summary, passes it to ElevenLabs for natural voice synthesis, and streams it back to the PWA so the worker can listen while driving.
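For the passwordless flow, the Auth0 SDK normally handles the wire protocol; as a rough, hedged sketch (tenant domain, client id, and helper name are placeholders), the underlying call that kicks off an email OTP hits the tenant's `/passwordless/start` endpoint:

```typescript
// Build the Auth0 passwordless-start request that emails the worker a
// one-time code. buildOtpStartRequest is a hypothetical helper for
// illustration; in production the Auth0 SPA SDK issues this call.
function buildOtpStartRequest(domain: string, clientId: string, email: string) {
  return {
    url: `https://${domain}/passwordless/start`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        client_id: clientId,
        connection: "email", // Auth0 passwordless email connection
        email,
        send: "code",        // send an OTP code rather than a magic link
      }),
    },
  };
}
```

This is what makes the "car dashboard" login viable: the worker types a six-digit code from their inbox instead of a complex password.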
Challenges we ran into
- The "Hallucination vs. Liability" Problem: Standard LLMs love to infer emotions (e.g., "The client was angry and unstable"). In social work, case notes are legal documents that get subpoenaed in eviction or child welfare court. We had to strictly constrain Gemini through Backboard's factual ingestion streams to only output observable actions and reported statements.
- Context Fragmentation: Managing a timeline of events where a client might mention a roommate mediation that happened three months ago. Building standard RAG wasn't enough; we had to leverage Backboard's stateful threads to ensure the AI could perform cross-temporal entity resolution (remembering who people are and when past interventions happened).
- Security vs. Usability: Social workers need extreme data security, but typing a 16-character password with special characters while sitting in a car outside an encampment is a terrible user experience. Navigating Auth0's tenant settings to build a flawless, locked-down Passwordless Magic Link/OTP flow was challenging but ultimately solved this perfectly.
Accomplishments that we're proud of
- Achieving "Subpoena-Safe" AI: We successfully built an ingestion agent that writes like a trained clinician. It completely removes the fluff and formats chaotic voice rants into clean, legally defensible bullet points.
- Zero-Data-Entry Scheduling: Our system doesn't just summarize; it extracts future commitments (e.g., "I will drive you to court on Thursday at 8 AM") and automatically injects them into the worker's triage itinerary.
- The Drive-Time Workflow: Seeing the ElevenLabs integration come to life transformed the app from just another "AI wrapper" into a deeply empathetic, accessibility-first tool built for the physical reality of frontline workers.
What we learned
- We learned that in healthcare and social services, AI isn't just about saving time with summarization; it's about liability, continuity, and compliance.
- We learned how to leverage Backboard as a true "Agent OS" rather than just a vector database, realizing the power of separating factual data streams from conversational data.
- We gained deep hands-on experience with identity management, learning how to configure authentication properly in secure, regulated fields like this one.
What's next for Waypoint
Next, we plan to reach out to local social services and experienced field workers; we already have connections with people who face this problem, and one of our members has secondhand experience with practicing social workers and their services. The apps they commonly use, like Jane and OWL, act as scheduling tools, but they do not take in calls, generate summaries, or auto-schedule around case timelines. We will work with these local services to bring their current cases into Waypoint through our upload features: mass-uploading preexisting case files makes migration easy, and once all case files are in, Waypoint generates "predicted transcripts" and summarized notes for each case.