Inspiration
Healthcare information can be overwhelming and hard to navigate. When someone searches for symptoms or conditions online, they're often met with scattered results, medical jargon, and no clear path forward. We wanted to create something different—a healthcare companion that combines the precision of search with the empathy of conversation.
The Elastic + Google Cloud hackathon inspired us to push the boundaries of what's possible when you merge hybrid search technology with generative AI. We asked ourselves: "What if searching for health information felt like talking to a knowledgeable friend who also had access to a medical library?" That question became Neural RX.
What it does
Neural RX is an AI-powered healthcare information discovery platform that makes finding medical information natural and intuitive. Here's how it works:
- Conversational Interface: Users ask health questions in plain language—no medical terminology required
- Hybrid Search: Elasticsearch combines keyword matching with semantic understanding to find the most relevant medical conditions from our healthcare database
- AI-Powered Responses: Google Gemini AI analyzes the search results and generates personalized, contextual responses that directly answer the user's question
- Intelligent Features:
  - Voice Input: Hands-free interaction using the Web Speech API
  - Context-Aware Chat: Remembers previous questions for natural follow-ups
  - Bookmarks: Star important messages for quick reference
  - Export: Download entire conversations in PDF, text, or JSON format
  - Quick Actions: One-click access to common health topics
  - Dark Mode: Easy on the eyes during late-night health research
The result? A healthcare search experience that feels less like Googling and more like having a conversation with a medical reference guide.
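As a sketch of the export path (the field names here are our illustration, not the app's exact schema), serializing a conversation for the JSON download option looks roughly like this:

```typescript
// Hypothetical shape of a chat message in the in-memory store.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
  timestamp: string;
  bookmarked?: boolean;
}

// Serialize a conversation for the JSON export option, stamped with the
// export time so saved files can be told apart later.
function exportConversationAsJson(messages: ChatMessage[]): string {
  return JSON.stringify(
    { exportedAt: new Date().toISOString(), messages },
    null,
    2
  );
}
```

The PDF and text exporters walk the same message array; only the serialization step differs.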
How we built it
Technology Stack:
- Frontend: React + TypeScript, Tailwind CSS, Shadcn UI
- Backend: Express.js server with TypeScript
- Search: Elasticsearch (via Elastic Cloud) for hybrid search
- AI: Google Gemini 2.5 Flash for natural language understanding and response generation
- Data Storage: In-memory storage for chat history and analytics
Architecture:
Data Model: We curated a database of 12 major medical conditions spanning categories like Cardiovascular, Respiratory, Endocrine, Mental Health, and more. Each condition includes symptoms, treatments, severity ratings, and detailed descriptions.
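For illustration, one record in that database can be modeled like this (the field names and severity scale are our assumptions, not the exact schema):

```typescript
// One curated condition record. Severity uses an assumed 1 (mild) to
// 5 (critical) scale for illustration.
interface MedicalCondition {
  id: string;
  name: string;
  category: string; // e.g. "Cardiovascular", "Respiratory", "Endocrine"
  description: string;
  symptoms: string[];
  treatments: string[];
  severity: number;
}

const example: MedicalCondition = {
  id: "hypertension",
  name: "Hypertension",
  category: "Cardiovascular",
  description: "Chronically elevated blood pressure.",
  symptoms: ["headache", "dizziness"],
  treatments: ["lifestyle changes", "ACE inhibitors"],
  severity: 3,
};
```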
Hybrid Search Pipeline: When a user asks a question, we use Elasticsearch's multi-field matching across condition names, descriptions, symptoms, and treatments. The hybrid approach combines traditional keyword search with semantic relevance scoring.
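In Elasticsearch's query DSL this maps naturally to a multi_match query; the field boosts and result size below are illustrative values, not our exact production tuning:

```typescript
// Build the search body for a plain-language question. Boosting name and
// symptoms weights direct keyword hits higher, while AUTO fuzziness
// tolerates minor typos in user input.
function buildSearchQuery(question: string) {
  return {
    query: {
      multi_match: {
        query: question,
        fields: ["name^3", "symptoms^2", "description", "treatments"],
        fuzziness: "AUTO",
      },
    },
    size: 5, // top hits passed on to the AI step
  };
}
```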
AI Integration: Gemini AI receives both the user's question and the top search results as context. It maintains conversation history (last 3 exchanges) to provide contextual, personalized responses while always reminding users to consult healthcare professionals.
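The prompt we hand to Gemini can be sketched like this (the exact wording and helper names are illustrative, and history is assumed to be trimmed upstream):

```typescript
interface Exchange {
  question: string;
  answer: string;
}

// Assemble a grounded prompt: system guidance, recent conversation,
// top search hits as context, then the new question.
function buildPrompt(
  history: Exchange[],
  hits: string[],
  question: string
): string {
  const historyText = history
    .map((e) => `User: ${e.question}\nAssistant: ${e.answer}`)
    .join("\n");
  return [
    "You are a healthcare information assistant. Always remind the user to consult a healthcare professional.",
    historyText && `Conversation so far:\n${historyText}`,
    `Relevant conditions from the medical database:\n${hits.join("\n")}`,
    `Question: ${question}`,
  ]
    .filter(Boolean)
    .join("\n\n");
}
```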
User Experience: We designed a split-panel interface—chat on the left, search results on the right—so users can see both the conversational AI response and the structured medical data simultaneously.
Development Process:
- Started with the data schema to ensure consistency between frontend and backend
- Built the Elasticsearch integration and seeded the healthcare database
- Integrated Google Gemini AI with context-aware conversation handling
- Implemented advanced features (voice, bookmarks, export) to enhance usability
- Designed a healthcare-focused UI with accessibility in mind
Challenges we ran into
Elasticsearch Authentication: We initially struggled with the difference between serverless and standard deployment authentication. Learning how Cloud ID formats and API key permissions differ took a fair amount of debugging and documentation diving.
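For reference, a standard Elastic Cloud deployment authenticates with a Cloud ID plus API key pair like the placeholder config below (a serverless project instead takes a plain endpoint URL, which is what tripped us up):

```typescript
// Placeholder credentials only. This object is what gets passed to
// `new Client(...)` from @elastic/elasticsearch for a standard
// (non-serverless) Cloud deployment.
const elasticConfig = {
  cloud: { id: "my-deployment:dXMtZWFzdC0xLmF3cy5mb3VuZC5pbw==" }, // from the deployment page
  auth: { apiKey: "base64-encoded-api-key" }, // needs read access to the index
};
```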
Context Management: Balancing conversation history for Gemini AI was tricky—too much context and responses became slow/expensive, too little and follow-up questions lost meaning. We settled on maintaining the last 3 exchanges.
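The trimming itself is simple; here is a sketch that caps both the exchange count and, as a crude token proxy we added for illustration, the total character count:

```typescript
interface Exchange {
  question: string;
  answer: string;
}

// Keep the last few exchanges for Gemini context. maxExchanges = 3 is the
// compromise described above; maxChars is an illustrative safety budget.
function trimHistory(
  history: Exchange[],
  maxExchanges = 3,
  maxChars = 4000
): Exchange[] {
  const recent = history.slice(-maxExchanges);
  const kept: Exchange[] = [];
  let total = 0;
  // Walk backwards so the newest exchanges survive the character budget.
  for (let i = recent.length - 1; i >= 0; i--) {
    const len = recent[i].question.length + recent[i].answer.length;
    if (total + len > maxChars) break;
    total += len;
    kept.unshift(recent[i]);
  }
  return kept;
}
```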
Voice Input Reliability: The Web Speech API works differently across browsers. We had to implement robust error handling and visual feedback to make the experience smooth.
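The first step is feature detection, since Chromium still exposes the API under a webkit prefix. Factoring the lookup into a small helper (names are ours) also makes it testable without a browser:

```typescript
// Resolve the SpeechRecognition constructor from a window-like object,
// or null if the browser has no Web Speech API support. Typed loosely
// so it can be exercised with a plain object in tests.
function getSpeechRecognition(win: Record<string, unknown>): unknown {
  return win["SpeechRecognition"] ?? win["webkitSpeechRecognition"] ?? null;
}
```

When this returns null, the app can disable the mic button up front rather than fail silently mid-recording.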
Bookmark State Management: Initially, bookmarks didn't persist across chat messages because state wasn't shared properly. We solved this by implementing a React Context Provider pattern to centralize bookmark management.
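Stripped of the React wiring, the state transition at the heart of that provider is a pure function (names illustrative):

```typescript
// Toggle a message id in the shared bookmark set. Returning a new Set
// (rather than mutating) is what lets React Context consumers re-render.
function toggleBookmark(
  bookmarked: ReadonlySet<string>,
  messageId: string
): Set<string> {
  const next = new Set(bookmarked);
  if (next.has(messageId)) {
    next.delete(messageId);
  } else {
    next.add(messageId);
  }
  return next;
}
```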
Scroll Behavior with Radix UI: The ScrollArea component required special viewport handling to detect overflow and implement a scroll-to-bottom button. Standard DOM methods didn't work—we had to access the Radix viewport ref directly.
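Once you have the viewport element, the show/hide decision for the button is pure arithmetic on its scroll measurements (the threshold value here is illustrative):

```typescript
// Show the scroll-to-bottom button only when the chat overflows and the
// user has scrolled more than `threshold` px away from the bottom.
function shouldShowScrollButton(
  scrollTop: number,
  scrollHeight: number,
  clientHeight: number,
  threshold = 40
): boolean {
  const overflows = scrollHeight > clientHeight;
  const distanceFromBottom = scrollHeight - clientHeight - scrollTop;
  return overflows && distanceFromBottom > threshold;
}
```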
Accomplishments that we're proud of
🎯 Seamless Integration: Successfully combining Elasticsearch's hybrid search with Gemini AI to create responses that are both accurate and conversational
🎨 Professional UI/UX: Built a healthcare application that's beautiful, accessible, and intuitive—not cluttered or overwhelming like many medical websites
🗣️ Voice-First Experience: Implemented hands-free interaction that makes healthcare research accessible to users with mobility challenges or those multitasking
📊 Structured + Conversational: Users get the best of both worlds—AI conversation on the left, structured medical data cards on the right
♿ Accessibility: WCAG AA compliant design with keyboard navigation, screen reader support, and thoughtful color contrast
⚡ Performance: Sub-8-second responses even with the full search + AI generation pipeline
What we learned
Technical Insights:
- How to architect hybrid search systems that balance keyword precision with semantic understanding
- The nuances of conversation history management in generative AI applications
- Browser API limitations and progressive enhancement strategies (Web Speech API)
- State management patterns for complex React applications (Context API)
Product Insights:
- Healthcare users need both accuracy (structured data) and empathy (conversational responses)
- Discoverability matters—quick action buttons dramatically improve engagement for new users
- Export functionality is crucial for healthcare—users want to save and share information with family or doctors
Hackathon Insights:
- Start with the data model first—everything else flows from there
- End-to-end testing reveals bugs that unit tests miss
- Good documentation (replit.md) saves hours when context-switching
What's next for Neural RX
🔬 Expand Medical Database: Add hundreds more conditions, medications, procedures, and preventive care information
🧬 Vector Search: Implement true semantic search using embeddings for even better result relevance
👤 User Accounts: Enable users to save chat history, bookmarks, and preferences across devices
📊 Analytics Dashboard: Visualize search patterns to identify trending health concerns and improve the knowledge base
🌍 Multilingual Support: Make healthcare information accessible in multiple languages
🔗 EHR Integration: Connect with Electronic Health Records (with proper consent) for personalized health guidance
🏥 Provider Directory: Integrate healthcare provider search to help users find doctors for their specific conditions
📱 Mobile App: Native iOS/Android apps for on-the-go health research
🤖 Continuous Learning: Implement feedback loops to improve search relevance and AI response quality over time
Neural RX represents the future of healthcare information discovery—where the precision of search meets the understanding of AI to help people make informed decisions about their health. We're excited to continue building and making healthcare information more accessible to everyone.
Built With
- css
- elastic
- elasticsearch
- express.js
- gemini
- react
- shadcn
- tailwind
- typescript
- ui