Inspiration
Imagine losing the ability to button your shirt or hold your grandkid's hand. EAS&E helps arthritis patients fight back with AI-guided support and computer vision hand tracking that makes therapeutic exercises actually stick. This is why we wanted to create a simple, interactive web app that helps people practice wrist movements, posture, and fine motor exercises using only their camera. We wanted to combine guided exercises, real-time hand tracking, and an AI assistant so users can learn more about arthritis and practice different exercises to alleviate their pain.
What it does
EAS&E is a web-based tool that provides real-time wrist and finger tracing exercises. Users can also create custom exercises to focus on specific movements catered to their pain region. We built an AI arthritis information assistant called Eva, powered by ElevenLabs: a voice-based RAG assistant that answers questions about arthritis and can also guide the user through the website. Speech-to-text is enabled to make the app as user-friendly as possible, with simple instructions and large buttons.
How we built it
We built the frontend with React.js, TailwindCSS, and Framer Motion, and the backend with Node.js, Express.js, and CORS, deployed on Railway. Custom shapes created with our shape tool are saved to a NoSQL Firebase backend. For hand tracking and tracing we used the MediaPipe Hands model for finger detection and the MediaPipe Pose model for posture exercises. Our voice-based AI assistant is built on ElevenLabs Conversational AI with RAG, and we used Gemini to generate a report of the user's session.
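MediaPipe Hands returns 21 normalized landmarks per detected hand, with the index fingertip at landmark 8. A tracing exercise over that output could be driven by a small hit-test like the sketch below (the function names and tolerance value are our own illustration, not necessarily what EAS&E ships):

```javascript
// Sketch of a tracing hit-test over MediaPipe Hands output. MediaPipe
// landmark coordinates are normalized to [0, 1] in both axes; landmark 8
// is the index fingertip.
const INDEX_FINGERTIP = 8;

// Euclidean distance between two normalized landmark points.
function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Advance along the traced shape while the fingertip stays within
// `tolerance` of the next target point; returns the updated target index.
function advanceTrace(landmarks, shapePoints, targetIndex, tolerance = 0.05) {
  const tip = landmarks[INDEX_FINGERTIP];
  while (
    targetIndex < shapePoints.length &&
    dist(tip, shapePoints[targetIndex]) <= tolerance
  ) {
    targetIndex += 1;
  }
  return targetIndex;
}
```

Progress through the exercise then falls out naturally as `targetIndex / shapePoints.length`.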
Challenges we ran into
We had to make sure finger detection worked properly, which required a lot of careful calculations. We also wanted the web app to be as user-friendly as possible for people with arthritis, so we designed a layout that requires the least amount of typing from the user. Furthermore, we had minimal experience with the technologies we were using, and we spent a lot of time learning everything as we went.
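One of those careful calculations is mapping MediaPipe's normalized landmark coordinates onto the drawing canvas. Since a webcam feed is usually mirrored for a selfie view, the x coordinate has to be flipped or the overlay drifts the wrong way. A minimal sketch (function name is hypothetical):

```javascript
// Hypothetical helper: map a normalized MediaPipe landmark (x, y in [0, 1])
// to canvas pixel coordinates, mirroring x so the overlay lines up with a
// selfie-view video feed.
function toCanvas(landmark, canvasWidth, canvasHeight, mirror = true) {
  const x = mirror ? 1 - landmark.x : landmark.x;
  return {
    x: x * canvasWidth,
    y: landmark.y * canvasHeight,
  };
}
```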
Accomplishments that we're proud of
We are proud of building a fully functional hand-tracking exercise that users can trace easily. We also implemented an AI chatbot to guide users through the website and tell them more about how to alleviate arthritis pain.
What we learned
We learned how to work with real-time video processing and the browser camera API, and how to integrate MediaPipe Hands for gesture detection and tracing. We also learned how to make UI/UX decisions for accessibility and comfort.
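One practical lesson in real-time video processing is that per-frame landmark positions jitter, which makes a traced cursor feel shaky. A simple exponential moving average, shown here as our own illustration rather than the app's exact code, steadies the signal:

```javascript
// Illustrative exponential moving average over landmark positions: alpha
// closer to 1 follows the raw detection faster, closer to 0 smooths harder.
function smoothLandmark(prev, current, alpha = 0.4) {
  if (!prev) return { ...current }; // first frame: nothing to smooth against
  return {
    x: prev.x + alpha * (current.x - prev.x),
    y: prev.y + alpha * (current.y - prev.y),
  };
}
```

Calling this once per frame with the previous smoothed value trades a little latency for a much steadier overlay.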
What's next for EAS&E
We plan on adding more exercises and making the UI/UX even more user-friendly. We also want to add logins and a saved-progress feature to tailor the experience to each user.
Built With
- cors
- elevenlabs
- express.js
- firebase
- gemini
- mediapipe
- node.js
- railway
- react.js
- tailwind.css
