What Inspired It The idea came from a pretty straightforward observation. Some students grow up knowing how to interview because they have access to coaching, professional networks, and parents who prep them. Students from underfunded schools often don't have any of that. Interview skills are largely treated as something you just figure out on your own, which isn't realistic for everyone. That inequality in access is what pushed us to build this.
How We Built It We focused on building something that gives real, specific feedback rather than surface-level responses. NextRound simulates interview scenarios and provides structured guidance on answer clarity, response structure, and communication patterns. We also incorporated school district and broadband access data to ground the tool in the communities that need it most. The technical side moved fast given the time constraints, so we had to make quick decisions about what to prioritize.
What We Learned Feedback design matters more than we initially expected. Early on, the responses the tool generated were technically accurate but not particularly useful. Telling someone to "be more clear" or "structure your answer better" does not give them anything concrete to work with. We shifted toward specificity: flagging when an answer lacked a clear outcome, or when someone spent too much time on context and not enough on their actual contribution. We also track eye contact and tonality to catch delivery issues that the words alone miss. That shift made a noticeable difference in how useful the practice sessions felt. We also learned that the framing of feedback changes how people receive it: the same note lands differently depending on how it's worded, so we spent almost as much time thinking about language as about functionality.
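To make the specificity idea concrete, here is a minimal sketch of what checks like "no clear outcome" and "too much context, not enough contribution" could look like. The marker words, thresholds, and `reviewAnswer` function are illustrative assumptions for this writeup, not NextRound's actual implementation.

```typescript
// Illustrative sketch only: heuristics and thresholds are assumptions,
// not the real NextRound feedback pipeline.

interface FeedbackNote {
  issue: string;
  suggestion: string;
}

// Hypothetical words treated as signals that the answer states an outcome.
const OUTCOME_MARKERS = ["result", "improved", "reduced", "delivered", "achieved"];

function reviewAnswer(answer: string): FeedbackNote[] {
  const notes: FeedbackNote[] = [];
  const lower = answer.toLowerCase();

  // Check 1: flag answers that never state a concrete outcome.
  if (!OUTCOME_MARKERS.some((w) => lower.includes(w))) {
    notes.push({
      issue: "No clear outcome",
      suggestion: "End with what changed because of your work, ideally with a number.",
    });
  }

  // Check 2: flag answers that spend most sentences on setup rather than
  // on the speaker's own contribution (sentences containing "I").
  const sentences = answer.split(/[.!?]+/).filter((s) => s.trim().length > 0);
  const iSentences = sentences.filter((s) => /\bI\b/.test(s)).length;
  if (sentences.length > 0 && iSentences / sentences.length < 0.3) {
    notes.push({
      issue: "Too much context, too little contribution",
      suggestion: "Shift from describing the situation to describing what you did.",
    });
  }

  return notes;
}
```

The point of rule-shaped checks like these is that each note names a specific, fixable problem rather than a generic "be clearer."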
The Challenges The main challenge was making the feedback feel constructive without being generic. Interview prep is already a high-pressure process for a lot of people, and a tool that comes across as cold or overly critical would push users away rather than help them improve. We needed the feedback to be honest and direct while still sounding like something a good mentor would say, not just an automated response. Getting that tone consistent across different question types and answer styles took more iteration than most of the technical work. Time was the other obvious constraint. There were features we scoped out early, like a progress tracking dashboard and an interview-facing view, that we had to set aside to keep the core experience functional and well tested.
Next Steps The most immediate priority is building out a progress tracking layer so students can actually see improvement over time rather than treating each session as isolated. Right now the feedback exists in the moment, but there is no long-term view, and that limits how useful the tool is for sustained practice. Beyond that, we want to invest in equity of access: a lighter interface, offline access to practice questions, and downloadable feedback summaries. Equity of access means making sure the tool actually works for the communities it is designed to serve, not just communities with reliable internet. On the institutional side, a counselor dashboard would let school staff monitor student activity, identify who needs more support, and use the tool as part of a broader college and career readiness program. That would move NextRound from a standalone app into something that fits inside existing school infrastructure.
Built With
- livedatatechnologies
- openai
- typescript