Inspiration
Just a year ago, the four of us were freshmen living on campus, without an RA on hand to answer our many questions. After some deliberation, we concluded that it would have been far easier if every dorm had an AI RA that could act as a temporary substitute for our busy RAs.
What it does
Simply put, AURA (AI-Powered University Residential Assistant) is an AI-integrated web application that acts as a digital RA, capable of answering any question a resident of a UGA dorm might have. It can schedule meetings with the RA, check items for recyclability to promote sustainability, link to a myriad of campus resources, and use location data to determine which residence hall the user lives in and tailor its answers accordingly. The information AURA provides is streamlined to benefit UGA residents through our AI modeling and dataset training.
How we built it
We relied on Visual Studio Code as our primary code editor and heavily utilized GitHub Copilot to help us deliver this project quickly. The front end was designed using Next.js and React, and Tailwind CSS was used to keep the aesthetics clean across all devices, desktop and mobile alike. The back end connects to the OpenRouter API to use GPT-3.5-turbo and Google Gemini, and LangChain keeps our RAG pipeline functioning properly. All UGA housing policy documents were converted from plain text into JSON objects so our chatbot retrieves actual answers instead of fabricating responses. The recycling feature uses two vision AI models: one to identify objects within images and a second to classify whether each object is recyclable. Our appointment scheduling feature uses Nodemailer to let the chatbot send emails via Gmail's SMTP service. The entire project was built through an iterative process: develop a feature, test it, move on to the next.
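To illustrate the grounding step of a pipeline like ours, here is a minimal sketch of retrieval over JSON policy documents and prompt assembly. This is not our LangChain code; the types, scoring, and prompt wording are simplified assumptions for illustration:

```typescript
// Illustrative keyword-overlap retrieval over policy JSON objects.
// (A real RAG pipeline would use embeddings; names here are invented.)
type PolicyDoc = { title: string; text: string };

function tokenize(s: string): string[] {
  return s.toLowerCase().match(/[a-z]+/g) ?? [];
}

// Score each doc by how many of its tokens appear in the query; keep the top k.
function retrieve(query: string, docs: PolicyDoc[], k = 2): PolicyDoc[] {
  const qTokens = new Set(tokenize(query));
  return docs
    .map(d => ({ d, score: tokenize(d.text).filter(t => qTokens.has(t)).length }))
    .filter(x => x.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map(x => x.d);
}

// Assemble a grounded prompt so the model answers from the policies, not from memory.
function buildPrompt(query: string, docs: PolicyDoc[]): string {
  const context = docs.map(d => `## ${d.title}\n${d.text}`).join("\n\n");
  return `Answer using only the policies below.\n\n${context}\n\nQuestion: ${query}`;
}
```

The key idea is the same as in our real pipeline: only the retrieved policy text is placed in front of the model, which is what keeps answers tied to actual UGA documents.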
Challenges we ran into
Getting the mobile camera functionality to work well was certainly one of the more significant challenges. We first tried live video streaming, but it wasn't intuitive and kept prompting users to enable the camera. In the end, we switched to the phone's native camera picker, which worked much better. Another problem was the dreaded 413 error: phone cameras produce enormous images, so we had to compress pictures client-side before sending them over. We also hit an annoying issue where our scheduling feature would fail with nothing but a vague "Invalid request" message; this turned out to be the frontend and backend using different field names, and our error handling was too generic to point that out. Finally, Vercel's read-only filesystem didn't play well with our email flow, so we made delivery best-effort and ensured the email still sends regardless.
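The fix for the 413 error boils down to downscaling the photo before upload. Below is a sketch of the dimension math involved; the function name, max edge length, and quality value are illustrative assumptions, not our exact code:

```typescript
// Compute dimensions that fit a phone photo inside a max edge length while
// preserving aspect ratio (illustrative; actual limits in the app may differ).
function scaleToFit(
  width: number,
  height: number,
  maxEdge: number,
): { width: number; height: number } {
  const scale = Math.min(1, maxEdge / Math.max(width, height));
  return { width: Math.round(width * scale), height: Math.round(height * scale) };
}

// In the browser, the scaled image is then drawn onto a canvas and re-encoded
// as a smaller JPEG before upload, e.g.:
//   canvas.toBlob(callback, "image/jpeg", 0.8);
```

Shrinking a ~4000-pixel-wide photo to around 1280 pixels cuts the payload by an order of magnitude, which is enough to stay under typical request-body limits.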
Accomplishments that we're proud of
The recycling checker is really cool, honestly: point your phone camera at something and it tells you whether it's recyclable, compostable, or trash, and how to dispose of it correctly. It stacks two forms of AI back to back, one to identify the object and another to classify it. Another major accomplishment was the RAG chatbot. Instead of giving generic answers any AI could produce, it actually pulls its answers from eight real UGA policy documents (community guide, code of conduct, academic honesty policy, and so on). So if a student asks about quiet hours or the guest policy, for once they get the actual policy instead of some fever dream of how the AI "imagined" it to be. It even knows things unique to each building, such as the front desk phone number, if the user tells it which dorm they're in.
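The two-stage stacking can be sketched as a small pipeline where each model is an injected function, which is also what makes the models swappable. The function and type names below are our own illustrative choices, not taken from the project:

```typescript
// Two-stage vision pipeline sketch: stage 1 names the object, stage 2 assigns
// a disposal category. Model callers are injected so either model can be swapped.
type Disposal = "recyclable" | "compostable" | "trash";

async function checkDisposal(
  imageB64: string,
  identify: (img: string) => Promise<string>,      // e.g. a vision model call
  classify: (label: string) => Promise<Disposal>,  // e.g. a text model call
): Promise<{ label: string; category: Disposal }> {
  const label = await identify(imageB64);  // stage 1: what is in the image?
  const category = await classify(label);  // stage 2: how should it be discarded?
  return { label, category };
}
```

Separating identification from classification also means the second stage can be tested (or replaced with a lookup table) without ever touching the vision model.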
What we learned
First and foremost, we learned a ton about how RAG actually works in practice: how to structure knowledge bases, direct queries to the correct docs, and inject contextual info to keep the AI grounded in reality. Also new to most of us was working directly with vision-based AI, especially when chaining multiple models (Gemini for identifying the object, GPT for classifying it). Of course, we also learned the hard way never to design for desktop first, as we essentially had to recompose our layout around a bottom tab bar to make it usable on phones. And finally, there's all the stuff you'd never think about unless it breaks on you: rate limiting, validating user input, and working through platform-level requirements (like Vercel's read-only filesystem).
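As an example of the "stuff you'd never think about," here is a minimal fixed-window rate limiter of the kind an API route might use. This is a sketch under assumed limits; the project's actual rate-limiting strategy and storage are not shown here:

```typescript
// Minimal fixed-window rate limiter, keyed by caller (e.g. IP address).
// Illustrative only: real deployments often need shared storage across instances.
class RateLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the caller identified by `key` is still under the limit.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.hits.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // New caller or expired window: start a fresh window.
      this.hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```

One caveat worth noting: on serverless platforms like Vercel, in-memory state like this resets between invocations, the same limitation our in-memory knowledge base has.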
What's next for AURA?
Right now the knowledge base resets when the server restarts since it's all in memory, so hooking up a real vector database like Pinecone would make it persistent and faster. We want to add UGA login so students automatically see info for their specific dorm without having to type it in. An admin dashboard for RAs to upload new policy documents would make it way more maintainable. And we'd love to add real-time streaming so you can see the AI typing its response instead of waiting for the whole thing to load. Eventually, we think this could work at any university — not just UGA — if we generalize the platform so other housing departments can plug in their own documents.