Inspiration
We’re nerds. When we go online shopping, we’re looking at all the specs, all the different utilities, and all the glaring (or glowing) reviews each product has. Each one of us has put together an Excel spreadsheet ranking the same product across different manufacturers, trying to find the best bang for our buck. All that time spent researching is tedious and exhausting, and it takes away time that could be spent talking to our AI girlfriends. So we’ve built our own agent, Wishlist, to reduce our browsing time, letting us spend more time doing the things we love (like messaging our AI girlfriends <3).
What it does
TLDR: You tell Wishlist 🐨 what you want, your requirements, and your budget. Wishlist finds the top results that suit you best, more accurately than any standalone LLM on the market.
Wishlist is an AI agent that scrapes the web using your requirements and picks out as many relevant products as possible. It then compares all that data against your requirements and ranks each product accordingly. From that list, Wishlist takes the top few products it thinks you’ll be interested in and presents them as nifty infographic cards. You can flip a card to learn more about a product’s specs, and if you’re interested in making a purchase, a link is there for your convenience.
How we built it
Our frontend stack is React + Next.js, and our interface was designed in Figma. User prompts are sent to our Python + Flask backend via a POST request. We then use a headless Selenium browser to search the web for items related to the user's prompt. Each item's page is scraped with BeautifulSoup, producing a sizeable array of JSON objects. The data is then cleaned and fed to Cohere's Command-A model, which picks the 3 best items in terms of price, quality, ratings, and company reputation. Finally, the items are displayed on our frontend and stored in our AWS DynamoDB database to power a "purchase history" feature. For deep analysis, an AI agent built with Cua + Claude Code collects data in place of Selenium. We used Windsurf and Claude Code to aid us throughout development.
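The clean-and-rank step can be sketched roughly like this. This is a minimal illustration, not our actual code: the field names (`title`, `price`, `rating`) are assumptions about the scraped JSON, and the hand-written score in `top_items` stands in for the real ranking, which Cohere's Command-A model performs.

```python
# Illustrative sketch only — names and scoring are assumptions, and the
# real ranking is done by Cohere's Command-A model, not this heuristic.

def clean_items(raw_items):
    """Drop scraped entries missing usable fields, and coerce types."""
    cleaned = []
    for item in raw_items:
        try:
            cleaned.append({
                "title": item["title"].strip(),
                "price": float(str(item["price"]).lstrip("$").replace(",", "")),
                "rating": float(item.get("rating", 0)),
            })
        except (KeyError, ValueError, AttributeError):
            continue  # skip malformed scrapes rather than crash the pipeline
    return cleaned

def top_items(items, budget, k=3):
    """Stand-in for the LLM ranking: prefer high ratings within budget."""
    in_budget = [i for i in items if i["price"] <= budget]
    return sorted(in_budget, key=lambda i: (-i["rating"], i["price"]))[:k]
```

In the real pipeline, the cleaned array would be serialized into the Command-A prompt and the model's top-3 picks would be what gets written to DynamoDB.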
Challenges we ran into
- Initially, we planned to dispatch an AI agent (Cua) as our main method of data retrieval. However, the agent was bottlenecked by speed; it took ~3 minutes for the agent to read 4 items, plus an additional minute to navigate to a website of its choice. For this reason, we decided that the old-fashioned web scraping approach was more suited to the task.
- We also had to learn several unfamiliar technologies, such as DynamoDB, Cua, and Cohere's Command-A model.
Accomplishments that we're proud of
- Successfully building an AI agent capable of navigating online storefronts and extracting data
- Implementing a fully working web scraper
- Our clean and intuitive user interface!
What we learned
- How to create Computer Use Agents
- Web scraping strategies and optimizations
- Brainstorming ideas is difficult
What's next for Wishlist
Wishlist plans to let users enter broader and more complicated budgets, such as planning for an entire camping trip, and then spin up a subagent for each shopping task. Each subagent would work from a shared budget.
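A toy sketch of the shared-budget idea (purely hypothetical — this feature isn't built yet, and the function name and weighting scheme are assumptions about how it might work):

```python
# Hypothetical sketch of the planned feature: divide one shared budget
# across shopping subtasks, each of which a subagent would handle.

def split_budget(total, tasks):
    """Allocate a shared budget across (task, weight) pairs by relative weight."""
    weight_sum = sum(weight for _, weight in tasks)
    return {name: round(total * weight / weight_sum, 2) for name, weight in tasks}
```

For a $300 camping trip, `split_budget(300, [("tent", 3), ("stove", 1), ("sleeping bag", 2)])` would hand the tent subagent $150, the stove subagent $50, and the sleeping-bag subagent $100.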
At a higher level, Wishlist could serve as a planning and management agent in financial roles, such as purchasing machinery and materials. We envision Wishlist generating multiple budgets within a short period of time, offering managers, directors, and officers a high-level overview of their organization and the ability to explore different budget configurations with different goals in mind.
Built With
- anthropic
- claude
- cohere
- cua
- docker
- dynamodb
- javascript
- next.js
- python
- react
- selenium
- windsurf




