Inspiration

Our project was inspired by the everyday challenges patients face, from remembering to take their medication to feeling isolated in their health journey. We saw the need for a solution that could do more than just manage symptoms—it needed to support patients emotionally, help prevent medication mistakes, and foster a sense of community. By using AI and creating a space where patients can connect with others in similar situations, we aim to improve not only their health outcomes but also their overall well-being.

What it does

Our project helps patients stay on track with their medication by using Apollo, our assistant that reminds them and tracks how they're feeling. It keeps a journal of their mood, sentiment, and actions, which can be shared with healthcare providers for better diagnosis and treatment. Users can also connect with others going through similar challenges, forming a supportive community. Beyond that, the platform helps prevent errors with prescriptions and medication, answers questions about their meds, and encourages patients to take an active role in their care—leading to more accurate diagnoses and reducing their financial burden.

How we built it

We built multiple components so that everyone could benefit from our voice assistant system. Our voice assistant, Apollo, reads the user's prescription using OCR and stores it in a database for future retrieval. Apollo then converses with the user, gathering information while offering comfort, through a pipeline of STT, text-processing, and TTS layers. Once a conversation ends, our LLM agents generate and summarize notes from the transcript, which are stored back in the database.

Artemis connects the user with other individuals who have gone through similar problems, using a RAG pipeline built on LangChain. Our Emergency Pipeline listens to the user's problem over a voice channel powered by React Native, evaluates the issue, and answers it with another RAG-centric approach. For every interaction, sentiment analysis is performed with the RoBERTa Large model, and records of the patient's behavior, activities, and mood are kept encrypted for future reference by both the user and their associated practitioners.

To make our system accessible, we developed both a React web application and a React Native mobile app. The web app provides a comprehensive desktop interface to the voice assistant, with easy access to conversation history, summaries, and connections made through Artemis. The React Native emergency app brings the voice assistant to users' smartphones, letting them seek help easily in an emergency.
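The staged conversation flow described above (STT, text processing, note-taking, TTS) can be sketched roughly as follows. This is a minimal, self-contained illustration of the architecture: the stage functions and their bodies are stand-in placeholders, not our actual STT, LLM, or TTS service integrations.

```python
from dataclasses import dataclass, field

@dataclass
class ConversationLog:
    """Stand-in for the journal of turns that later gets summarized."""
    turns: list = field(default_factory=list)

    def add(self, user_text: str, reply: str) -> None:
        self.turns.append({"user": user_text, "assistant": reply})

    def summary(self) -> str:
        # Placeholder for the LLM-agent summarization that runs after a session.
        return f"{len(self.turns)} turn(s) recorded"

def transcribe(audio: bytes) -> str:
    # STT layer stand-in: the real system converts speech audio to text here.
    return audio.decode("utf-8")

def respond(text: str) -> str:
    # Text-processing layer stand-in: the real system calls an LLM agent.
    return f"I hear you: {text}"

def synthesize(text: str) -> bytes:
    # TTS layer stand-in: the real system returns synthesized speech audio.
    return text.encode("utf-8")

def handle_turn(audio: bytes, log: ConversationLog) -> bytes:
    """One conversation turn: STT -> text processing -> journal -> TTS."""
    user_text = transcribe(audio)
    reply = respond(user_text)
    log.add(user_text, reply)
    return synthesize(reply)

log = ConversationLog()
reply_audio = handle_turn(b"I forgot my morning dose", log)
print(reply_audio.decode("utf-8"))  # I hear you: I forgot my morning dose
print(log.summary())                # 1 turn(s) recorded
```

Keeping each stage behind its own function boundary is what lets the STT or TTS provider be swapped without touching the rest of the pipeline.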

Challenges we ran into

One of the key challenges we faced was ensuring the usability of the system. We wanted to create an intuitive experience that would be easy for users to navigate, especially during moments of mental distress. Designing a UI that is both simple and effective was difficult, as we had to strike the right balance between offering powerful features and avoiding overwhelming the user with too much information or complexity.

Accomplishments that we're proud of

One of the biggest accomplishments we’re proud of is how accessible and user-friendly our project is. We’ve built an AI-powered platform that makes managing health easier for everyone, including those who may not have easy access to healthcare. By integrating features like medication reminders, mood and sentiment tracking, and a supportive community, we’ve created a tool that’s inclusive and intuitive. Our platform bridges the gap for those who may struggle with traditional healthcare systems, offering real-time answers to medication questions and preventing errors, all while fostering patient engagement. This combination of accessibility and smart features empowers users to take control of their health in a meaningful way, ensuring patient safety.

What we learned

Throughout this project, we gained valuable experience working with new APIs that we had never used before, which expanded our technical skills and allowed us to implement features more effectively. We also learned how to better manage project progress by setting clear goals, collaborating efficiently, and adapting to challenges as they arose. This taught us the importance of flexibility and communication within the team, helping us stay on track and deliver a functional product within the tight timeframe of the hackathon.

What's next for SafeSpace

In the future, we plan to enhance the platform with a strong focus on patient safety by integrating a feature that checks for drug interactions when a doctor provides a prescription, preventing harmful combinations. Additionally, we aim to implement anti-hallucination measures to prevent false diagnoses, safeguarding the accuracy of the assistant’s recommendations. To further protect users, we will incorporate robust encryption techniques to securely manage and store sensitive data, ensuring the highest level of privacy and security for patient information.
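The planned interaction check could take a shape like the sketch below. The interaction table here is a tiny illustrative placeholder, not medical guidance; a production version would query a vetted drug-interaction database instead.

```python
from itertools import combinations

# Illustrative placeholder pairs only -- NOT medical advice. A real system
# would look these up in a maintained clinical interaction database.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"lisinopril", "ibuprofen"}): "reduced kidney function",
}

def check_prescription(drugs):
    """Return (drug_a, drug_b, warning) for every flagged pair in a prescription."""
    normalized = sorted({d.lower() for d in drugs})
    warnings = []
    for a, b in combinations(normalized, 2):
        note = INTERACTIONS.get(frozenset({a, b}))
        if note:
            warnings.append((a, b, note))
    return warnings

print(check_prescription(["Warfarin", "Aspirin", "Metformin"]))
# [('aspirin', 'warfarin', 'increased bleeding risk')]
```

Normalizing names and checking every pairwise combination keeps the check simple; the hard part in practice is the quality and coverage of the underlying interaction data.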
