What inspired us Despite the growing prevalence of LLMs, their power has yet to be leveraged to improve students' in-class experience. In particular, professors often discourage LLM use because the models give inaccurate information, or too much of it. To remedy this, we created an LLM-based assistant with access to all the materials for a course, including the course information, lecture notes, and problem sets. To make it useful for actual courses, we ensured the LLM does not answer problem-set questions directly. Instead, it guides the student and provides relevant information so the student can complete the coursework without being handed the answer. In effect, it serves as a TA that helps students navigate their problem sets.

What we learned Through this project, we delved into the complexities of integrating AI into a software product, uncovering the essential role of user interface design and the nuanced craft of prompt engineering. Crafting effective prompts requires a deep understanding of both the AI's capabilities and the project's specific needs; success depends on translating educational objectives into prompts that elicit meaningful AI responses, a process that demands equal parts precision and creativity.

Our exploration also introduced us to the concept of retrieval-augmented generation (RAG), which combines the power of information retrieval with generative models to enhance the AI's ability to produce relevant and contextually accurate outputs. While we explored the potential of using the OpenAI and Together APIs to enrich our project, we ultimately did not incorporate them into our final implementation. This exploration nevertheless broadened our understanding of the diverse AI tools available and their potential applications. It underscored the importance of selecting the right tools for specific project needs, balancing the cutting-edge capabilities of such APIs against the project's goals. The experience highlighted the dynamic nature of AI project development, where learning about and testing various tools is a foundational part of the journey, even when some are not used in the end.
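To illustrate the RAG idea in miniature: embed the course chunks, retrieve the chunks most similar to the student's question, and prepend them to the prompt. This is only a sketch, not our production code; the `embed` function below is a toy bag-of-words stand-in for a real embedding API, and names like `retrieve` and `VOCAB` are illustrative assumptions.

```javascript
// Toy embedding: term-frequency vector over a small fixed vocabulary.
// A real system would call an embedding API instead.
const VOCAB = ["recursion", "pointer", "stack", "queue", "syllabus", "grading"];
function embed(text) {
  const words = text.toLowerCase().split(/\W+/);
  return VOCAB.map(term => words.filter(w => w === term).length);
}

// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = v => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return norm(a) && norm(b) ? dot / (norm(a) * norm(b)) : 0;
}

// Retrieve the top-k course chunks most similar to the query.
function retrieve(query, chunks, k = 2) {
  const q = embed(query);
  return chunks
    .map(chunk => ({ chunk, score: cosine(q, embed(chunk)) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map(({ chunk }) => chunk);
}

const chunks = [
  "Lecture 7 covers recursion and the call stack.",
  "The syllabus explains the grading breakdown.",
  "Assignment 3 uses a queue to schedule tasks.",
];
// The retrieved chunks would be prepended as context to the LLM prompt.
const context = retrieve("How does recursion use the stack?", chunks);
```

In a production setup, the in-memory loop would typically be replaced by a vector store (e.g. pgvector in a Postgres database), but the retrieve-then-generate shape stays the same.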

How we built our project Building our project began with assembling a comprehensive dataset from Stanford's CS 106B course, including the syllabus, problem sets, and lectures. This ensured our AI chatbot had a detailed understanding of the course's structure and content, setting the stage for it to function as an advanced educational assistant.

Beyond compiling course materials, much of our work went into refining an existing chatbot user interface (UI) to better serve the specific needs of students engaging with the course. This was far from straightforward: it demanded a deep dive into the chatbot's underlying logic as well as fresh thinking about how it interacts with users.

The modifications we made were extensive and targeted at enhancing the user experience by adjusting the output behavior of the large language model (LLM). A pivotal change involved programming the chatbot to moderate the explicitness of its hints in response to queries about problem sets. This required intricate tuning of the LLM's output to strike a balance between guiding students and stimulating independent problem-solving. Integrating direct course content into the chatbot's responses also necessitated a thorough understanding of the LLM's mechanisms, so that the chatbot could accurately reference the course materials and effectively filter and prioritize information from the course data. Overall, the effort to modify the chatbot's output behavior underscored the complexity of working with advanced AI tools, highlighting the technical skill and creativity required to adapt these systems to specific educational objectives.
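The hint-moderation step above can be sketched as prompt construction: the system prompt combines guardrail instructions (never give the final answer; offer hints and guiding questions) with the retrieved course material. This is a simplified illustration, assuming a hypothetical `buildSystemPrompt` helper; the exact wording and structure of our real prompts differed.

```javascript
// Assemble a system prompt that steers the LLM toward hints rather than
// answers. Function and parameter names here are illustrative assumptions.
function buildSystemPrompt(courseName, contextChunks) {
  const guardrails = [
    `You are a teaching assistant for ${courseName}.`,
    "Never give the final answer to a problem set question.",
    "Instead, offer a hint, point to relevant lecture material,",
    "and ask a guiding question that moves the student forward.",
  ].join(" ");
  // Append retrieved course material, if any, as bulleted context.
  const context = contextChunks.length
    ? `\n\nRelevant course material:\n${contextChunks.map(c => `- ${c}`).join("\n")}`
    : "";
  return guardrails + context;
}

const prompt = buildSystemPrompt("CS 106B", [
  "Lecture 7 covers recursion and the call stack.",
]);
```

The resulting string would be sent as the system message of a chat request, so the guardrails apply to every student query in the conversation.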

Challenges we faced Some challenges we faced included scoping our project to ensure it was feasible given our constraints for this hackathon, including time. We learned React.js and PL/pgSQL for this project, having previously used only JavaScript. Other challenges included installing Docker and the Supabase CLI and ensuring all dependencies were properly managed. We also had to configure Supabase and create the database schema. Finally, there were deployment configuration issues, as we had to integrate our front-end application with our back end and ensure the two communicated properly.

Built With

  • plpgsql
  • react.js
  • supabase
  • vercel