Gradient
Everyone exists on a gradient. Now everyone can build on one too.
VIEW PROJECT DOCS! (pls click me and view pitch deck)
We had the perfect team combination, a match made in heaven, even though none of us had more than one hackathon under our belt. When you have hardware like the Raven Glasses that aligns exactly with your ethos, your skill sets, and
everything you've built before, it inspires you to think bigger (if not BETTER). None of that silly hackathon
"change the world" talk, but genuinely envisioning what the future could look
like for every individual, and creating solutions that stand among the GREATS, like our favorite tool, Claude.
But before we could build anything, we had to reframe the entire question.
The hackathon prompt pushed us toward "accessibility tools," and our first
instinct was to build something that helps people with disabilities. But that
framing felt wrong. It felt like we'd be building for people rather than
with them. So we stopped, stepped back, and asked ourselves a different
question:
Why are we treating 1.3 billion people like they're identical?
Two developers don't write the same code. Two artists don't create in the same
medium. Yet when we build accessibility tools, we slap on clinical
labels—"autism mode," "ADHD mode"—and call it a day. That's just categorization.
The reframe that unlocked everything: all people exist on spectrums. All
people are people. Why not let everyone customize a solution based on their
unique position as a human being, and have it adapt to their needs?
This is part of the human experience. The people who would benefit most from
XR are the ones who can build it for themselves.
That's why we made Gradient.
What it does
Gradient is a personalization layer that empowers people with accessibility
needs to dream up any AR application: no code, no stigma, and no
one-size-fits-all limitations.
Instead of asking "What's your disability?", we ask "How do you experience the world?"
Through natural questions about sensory preferences, cognitive patterns, and
daily challenges, we generate a fully custom AR tool tailored to each user's
unique combination of needs.
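To make the idea concrete, here is a minimal sketch of how experience-first answers could map to feature flags instead of diagnostic labels. We use Pydantic (which the project lists), but the field names, question set, and thresholds are our own illustrative assumptions, not Gradient's actual schema:

```python
from pydantic import BaseModel, Field


class ExperienceProfile(BaseModel):
    """Hypothetical schema: non-clinical answers about how a user
    experiences the world (not Gradient's real onboarding model)."""
    prefers_low_visual_density: bool = False
    sound_sensitivity: int = Field(default=5, ge=0, le=10)
    reading_pace: str = "moderate"  # "slow" | "moderate" | "fast"
    daily_challenges: list[str] = Field(default_factory=list)


def profile_to_features(p: ExperienceProfile) -> dict:
    """Map experience answers to AR feature flags, never to labels."""
    return {
        "minimal_ui": p.prefers_low_visual_density,
        "mute_non_essential_audio": p.sound_sensitivity >= 7,
        "caption_speed": {"slow": 0.7, "moderate": 1.0, "fast": 1.4}[p.reading_pace],
        "reminder_overlays": "remembering tasks" in p.daily_challenges,
    }


profile = ExperienceProfile(prefers_low_visual_density=True, sound_sensitivity=8)
print(profile_to_features(profile))
```

The point of the mapping is that no answer ever selects an "autism mode" or "ADHD mode"; each response independently toggles or tunes a feature.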
We solved a huge stigmatization problem while simultaneously enhancing the
app-building experience for everyone, whether you have specific accessibility needs or not,
and whether you're on the spectrum or not. It's the ultimate invisible personalization
layer: a "Build Your Own Agent" experience for AR.
How we built it
This was an insanely complicated workflow, orchestrating many data servers and a whole lot of LLM crunching.
- AI Orchestrator: A LangGraph-powered pipeline that takes natural language input and translates it into AR application specifications
- Dynamic Generation: Custom UI layouts, features, and interaction patterns generated based on the user profile
- Raven Glass Integration: Built specifically for Raven's eye-tracking capabilities and Linux-based platform
- Experience-First Onboarding: A non-clinical question flow that maps user responses to features without stigmatizing language
See our flowchart for the full orchestration architecture.
Challenges we ran into
- Getting onboarding right: The onboarding system needed to extract intent and data accurately while asking the right questions, without making it seem like we were fishing for a predetermined solution. Finding that balance took multiple iterations.
- Prompt engineering at scale: Normal prompting doesn't cut it. We make assumptions on the user's behalf, much like every UI component needs features that tell it what data to display. Orchestrating purpose, pain points, primary features, and visualization while utilizing templates was genuinely difficult.
- Documentation digestion: The system is so complex that we needed LLMs to help digest our own documentation. Building something this intricate required extremely clear documentation just to keep track of everything.
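To illustrate the template-driven prompting described above, here is a tiny sketch that assembles purpose, pain points, primary feature, and visualization into one structured prompt. The template text and field names are invented for illustration; Gradient's real templates are more involved:

```python
from string import Template

# Hypothetical prompt template: the slots mirror the four things the
# pipeline orchestrates (purpose, pain points, features, visualization).
SPEC_PROMPT = Template(
    "You are generating an AR tool specification.\n"
    "Purpose: $purpose\n"
    "Pain points: $pain_points\n"
    "Primary feature: $feature\n"
    "Visualization: $visualization\n"
    "Assume sensible defaults for anything unspecified."
)

prompt = SPEC_PROMPT.substitute(
    purpose="help the user stay focused during lectures",
    pain_points="busy visuals, fast speech",
    feature="live captions",
    visualization="single floating panel",
)
print(prompt)
```

Keeping the slots explicit is what lets the orchestrator fill in assumed values when a user's answers leave a field blank.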
Accomplishments that we're proud of
- Built onboarding software that truly leverages AI as a tool for human use: the integration layer between human and AI that we can proudly show off
- Created a working prototype that generates end-to-end in approximately 45 seconds, producing meaningful AR tools based on user input
- Fully understood how to integrate with Raven hardware; this is the SaaS layer that can enable so much more for their device
- Developed architecture and orchestration patterns that would be very difficult to replicate in such a short timeframe
- Designed for an underserved market and opened up a new gap in the space
What we learned
- People don't want accessibility tools—they want TOOLS. The framing matters as much as the functionality.
- The spectrum is the insight. Treating accessibility as a gradient rather than binary categories unlocks entirely new design possibilities.
- Scale is wild. We learned to manage 9 Claude context windows simultaneously, with each running close to 36 sessions in parallel.
What's next for Gradient
- Expand the component library: More features and UI components mean endless possibilities
- User testing across the spectrum: Test with people across accessibility needs to refine prompts for a broader audience
- Deeper Raven Glass integration: We want to use the IMU for nodding-based micro-interactions to proceed through flows
- Partnerships: Explore collaborations with accessibility advocacy organizations to bring XR into day-to-day life
- The vision: Build toward a future where every person can dream up the XR tools they need, designed with utility, accessibility, and human happiness in mind
Open to hire! (some of us, at least)
Built With
- claude
- dedication
- dreams
- hope
- langgraph
- openai
- openrouter
- p5.js
- passion
- pydantic
- python
- raven-framework
- support