Overview
Index is an AI-driven, voice-based student assistant platform developed during our time at Johns Hopkins University. Designed for Google Assistant and smart devices, Index lets students interact with critical university systems in natural language: checking book availability, registering for classes, looking up course schedules, and getting real-time campus information using just their voice.
Index unifies fragmented academic systems (library databases, course registration portals, and student information systems) into a single, voice-controllable interface powered by conversational AI.
Problem
University systems are often siloed and not user-friendly, forcing students to navigate clunky portals for simple tasks like finding books, registering for classes, or checking course prerequisites. There was no unified interface that allowed seamless interaction across services, nor did any existing system support voice-based queries or natural interaction.
What It Does
Index acts as a unified AI assistant for university life, accessible via Google Assistant on phones and smart speakers. Key capabilities include:
Library Search: Get exact shelf location, availability, and summaries of books
Course Lookup: Search courses by name, department, or professor
Class Registration: Register or drop classes (integrated with SIS where permissions allow)
Campus Info Access: Retrieve building hours, event schedules, and faculty office hours
Voice-to-Email: Send results, course schedules, or links directly to student email
Personalized Interaction: Tailors responses based on student role (undergrad, grad, etc.)
This turns the scattered experience of accessing academic resources into a seamless, voice-first workflow.
How we built it
Web Scraping + Reverse Engineering: Since JHU didn’t expose public APIs, we built Python-based scrapers and reverse-engineered campus systems (library, course catalogs, SIS frontend).
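A minimal sketch of how such a scraper could look, using `requests` and BeautifulSoup. The catalog URL, CSS selectors, and field names here are illustrative assumptions, not JHU's actual markup:

```python
# Hypothetical library-catalog scraper sketch; URL and selectors are assumptions.
import requests
from bs4 import BeautifulSoup


def parse_catalog_results(html: str) -> list[dict]:
    """Extract title, availability, and shelf location from a results page."""
    soup = BeautifulSoup(html, "html.parser")
    results = []
    for item in soup.select("div.document"):  # result-row selector is an assumption
        results.append({
            "title": item.select_one("h3.title").get_text(strip=True),
            "status": item.select_one("span.status").get_text(strip=True),
            "location": item.select_one("span.location").get_text(strip=True),
        })
    return results


def search_books(query: str) -> list[dict]:
    """Fetch a catalog search page and parse it (endpoint is hypothetical)."""
    resp = requests.get("https://catalyst.library.jhu.edu/catalog",
                        params={"q": query}, timeout=10)
    resp.raise_for_status()
    return parse_catalog_results(resp.text)
```

Keeping the HTML parsing separate from the HTTP fetch made it easier to re-test parsers whenever a campus page's markup changed.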
API Development: Created a Flask-based REST API as a unified interface for all scraped and static data.
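A stripped-down sketch of that unified API surface. The route names and in-memory data are placeholders; in practice the handlers would call the scrapers or cached results:

```python
# Minimal Flask API sketch; routes and sample data are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder data standing in for scraped library results.
BOOKS = [{"title": "Linear Algebra Done Right", "status": "Available"}]


@app.route("/api/books")
def books():
    """Case-insensitive title search over the (placeholder) book data."""
    q = request.args.get("q", "").lower()
    return jsonify([b for b in BOOKS if q in b["title"].lower()])


@app.route("/api/courses")
def courses():
    """Would query the scraped course catalog; empty placeholder here."""
    return jsonify([])
```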
Google Assistant Integration: Used Google Actions SDK and Dialogflow to handle voice commands, map intents, and deliver conversational responses.
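Dialogflow delivers matched intents to a fulfillment webhook as JSON (`queryResult.intent.displayName` plus extracted parameters), and expects a `fulfillmentText` reply. A sketch of such a webhook, where the intent names, parameter keys, and canned reply are assumptions:

```python
# Sketch of a Dialogflow (v2) fulfillment webhook; intent and parameter
# names are assumptions for illustration.
from flask import Flask, jsonify, request

app = Flask(__name__)


def handle_book_search(params: dict) -> str:
    title = params.get("book-title", "that book")
    # Would call the unified REST API here; canned reply for illustration.
    return f"I found {title} on the B-Level shelves, and it is available."


INTENT_HANDLERS = {"search_book": handle_book_search}


@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(force=True)
    intent = body["queryResult"]["intent"]["displayName"]
    params = body["queryResult"].get("parameters", {})
    handler = INTENT_HANDLERS.get(intent)
    text = handler(params) if handler else "Sorry, I can't help with that yet."
    return jsonify({"fulfillmentText": text})
```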
Security & Context Awareness: Integrated user authentication and context-based flow switching to handle sensitive tasks like registration.
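The gating logic can be illustrated as a small routing check; the session shape and intent names below are hypothetical:

```python
# Illustrative gating of sensitive intents behind authentication;
# session structure and intent names are assumptions.
SENSITIVE_INTENTS = {"register_class", "drop_class"}


def route_intent(intent: str, session: dict) -> str:
    """Require a linked, authenticated account before registration flows."""
    if intent in SENSITIVE_INTENTS and not session.get("authenticated"):
        return "Please link your university account first."
    return f"Proceeding with {intent}."
```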
Technical Highlights
End-to-end integration between voice interface and legacy university systems
Modular backend designed for multi-university scaling
Early implementation of zero-UI design in education tech
Extensible intent framework to support future features (e.g., transcript access, TA booking)
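The extensible intent framework mentioned above can be sketched as a decorator-based registry, so new features plug in without touching the dispatcher. Names here are illustrative:

```python
# Sketch of a decorator-based intent registry; handler names are assumptions.
INTENTS = {}


def intent(name):
    """Register a handler for a named intent."""
    def register(fn):
        INTENTS[name] = fn
        return fn
    return register


@intent("course_lookup")
def course_lookup(params):
    return f"Looking up {params.get('course', 'that course')}."


def dispatch(name, params):
    handler = INTENTS.get(name)
    return handler(params) if handler else "Unknown request."
```

Adding transcript access or TA booking later would mean writing one new decorated handler, nothing more.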
Challenges
Lack of APIs meant scraping and interpreting inconsistent HTML structures
Mapping ambiguous voice queries to structured academic data
Ensuring accuracy and error handling in high-stakes flows like course registration
Complying with authentication and data-access constraints across platforms
Broader Impact
Index demonstrates how AI assistants can meaningfully improve accessibility, user experience, and productivity in academic settings. It aligns with larger trends in voice-driven computing, AI for education, and assistive technologies.
Empowers students with disabilities or time constraints
Reduces friction in accessing administrative and academic services
Promotes the idea of human-centered design for institutional software
Future Plans
We envision Index evolving into a university-wide platform that supports:
Full integration with learning management systems (e.g., Canvas and Blackboard)
Conversational tutoring (LLM-based Q&A on syllabi and lectures)
Voice-activated forms and campus service requests
Automated push notifications for academic deadlines and alerts
Built With
- flask
- google-actions
- natural-language-processing
- php

