Inspiration

The healthcare sector generates a vast amount of data, yet extracting actionable insights from patient records remains a challenge. HyperHealth Voice is inspired by the need for a smarter, more intuitive system to empower doctors with rapid access to crucial patient information using natural voice commands. By integrating knowledge graphs with advanced AI, we aim to bridge the gap between data complexity and accessibility in healthcare.

What it does

HyperHealth Voice enables doctors to upload patient reports in PDF format and interact with the system through voice commands to retrieve information such as blood pressure, sugar levels, or diagnosis history. Key features include:

1) Voice Command Support: Seamless query execution using voice for natural and efficient interactions.
2) Patient Report Parsing and Storage: Converts uploaded medical reports into structured text, stores them as vector embeddings using the Hypermode-hosted embeddings model, and links the data to a Neo4j knowledge graph.
3) Intelligent Search: Uses Neo4j vector search to provide precise, context-aware answers to complex medical queries.
4) Knowledge Graph Insights: Leverages Neo4j to visualize relationships between symptoms, treatments, and patient history.
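As a rough illustration of the voice flow, a transcribed command can be mapped to a structured query intent before anything touches the backend. This TypeScript sketch is purely illustrative: the keyword table, the `parseVoiceCommand` helper, and the patient-name heuristic are hypothetical, not the project's actual implementation.

```typescript
// Hypothetical shape for a parsed voice query.
type QueryIntent = { field: string; patient: string | null };

// Illustrative keyword-to-field table (assumed names, not the real schema).
const FIELD_KEYWORDS: Record<string, string> = {
  "blood pressure": "bloodPressure",
  "sugar": "bloodSugar",
  "diagnosis": "diagnosisHistory",
};

function parseVoiceCommand(transcript: string): QueryIntent {
  const text = transcript.toLowerCase();
  // Pick the first known clinical field mentioned in the transcript.
  const match = Object.keys(FIELD_KEYWORDS).find((k) => text.includes(k));
  // A real system would resolve the patient from session context; this
  // sketch only looks for a trailing "for <name>" phrase.
  const patient = /for ([a-z ]+)$/.exec(text)?.[1]?.trim() ?? null;
  return { field: match ? FIELD_KEYWORDS[match] : "unknown", patient };
}
```

In the real app the structured intent would then be turned into a Cypher query or a vector search against the stored report embeddings.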

How we built it

1) Frontend: Built with Next.js to create an intuitive, responsive interface for report uploads and voice interactions.
2) Backend: Developed with the Modus API Framework to mediate between the AI models and the Neo4j knowledge graph.
3) Data Processing:
   a) Vector Embeddings: Created embeddings with a Hypermode-hosted MiniLM model to represent patient data semantically for efficient vector search.
   b) Knowledge Graph: Constructed a Neo4j graph database capturing relationships between patient details, symptoms, and medical history, and integrated it with Modus so AI queries can traverse it seamlessly.
4) Voice Search: Applied natural language processing to convert voice commands into actionable queries.
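To make the retrieval step concrete: in production the query embedding comes from the Hypermode-hosted model and the similarity search runs inside Neo4j's vector index, but the ranking idea can be sketched in-memory. This TypeScript sketch is an assumption-laden illustration (the `Chunk` type and `topK` helper are invented for the example), not the deployed code.

```typescript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// A stored report fragment with its embedding (illustrative type).
type Chunk = { text: string; embedding: number[] };

// Return the k chunks most similar to the query embedding.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, k);
}
```

Inside Neo4j, the equivalent lookup is delegated to the database's vector index rather than computed in application code.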

Challenges we ran into

1) Using Modus APIs in AssemblyScript: Some of the Modus documentation is incomplete and some dependencies were set up incorrectly, so we had to work things out by reading the modus-sdk source code.
2) Knowledge Graph Complexity: Mapping intricate relationships within patient data onto a knowledge graph demanded careful schema design and optimization.
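The schema-design challenge boils down to upserting patients, symptoms, and treatments without duplicating nodes. A minimal sketch of one way to build such a parameterized Cypher upsert; the labels, relationship types, and the `buildGraphQuery` helper here are illustrative assumptions, not the project's exact schema.

```typescript
// Illustrative input shape for one parsed patient report.
type PatientRecord = { name: string; symptoms: string[]; treatments: string[] };

// Build a parameterized Cypher statement that MERGEs (upserts) the patient
// and links each symptom and treatment. Note: if `symptoms` is empty, the
// UNWIND yields no rows and the treatment part never runs (fine for a sketch).
function buildGraphQuery(rec: PatientRecord): {
  cypher: string;
  params: PatientRecord;
} {
  const cypher = [
    "MERGE (p:Patient {name: $name})",
    "WITH p",
    "UNWIND $symptoms AS s",
    "MERGE (sym:Symptom {name: s})",
    "MERGE (p)-[:HAS_SYMPTOM]->(sym)",
    "WITH DISTINCT p",
    "UNWIND $treatments AS t",
    "MERGE (tr:Treatment {name: t})",
    "MERGE (p)-[:RECEIVED]->(tr)",
  ].join("\n");
  return { cypher, params: rec };
}
```

Using MERGE rather than CREATE is what keeps repeated uploads of the same report from producing duplicate Patient or Symptom nodes.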

Accomplishments that we're proud of

1) Successfully implemented a system that combines the power of knowledge graphs, Hypermode vector embeddings, and the Modus API framework for healthcare.
2) Achieved accurate, context-aware voice search for patient data queries.
3) Developed a scalable, intuitive interface for doctors that reduces their cognitive load.
4) Leveraged Neo4j to make data relationships transparent and enable complex insights.

What we learned

1) The power of the Modus API framework, and how to use Modus to run complex Cypher queries and RAG operations.
2) Techniques for optimizing PDF text extraction and building robust embeddings for large-scale data.
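One lesson from the embedding work can be sketched concretely: extracted PDF text embeds better when split into overlapping windows, so a fact straddling a boundary still lands wholly inside some chunk. The sizes and the `chunkText` helper below are illustrative assumptions, not the values or code used in the project.

```typescript
// Split extracted report text into overlapping word windows.
// `size` is the window length in words; `overlap` is how many words
// each window shares with the previous one (illustrative defaults).
function chunkText(text: string, size = 200, overlap = 50): string[] {
  const words = text.split(/\s+/).filter(Boolean);
  const chunks: string[] = [];
  for (let start = 0; start < words.length; start += size - overlap) {
    chunks.push(words.slice(start, start + size).join(" "));
    // Stop once a window reaches the end of the text.
    if (start + size >= words.length) break;
  }
  return chunks;
}
```

Each chunk is then embedded separately, so the vector index can match a query against the specific passage that answers it rather than the whole report.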

What's next for HyperHealth Voice

1) Broader Dataset Integration: Extend support to additional data formats such as imaging reports (e.g., X-rays, MRIs).
2) Real-Time Monitoring: Integrate with IoT devices to track patient vitals in real time.

Built With

  • embeddings
  • hypermode
  • modus
  • neo4j
  • nextjs