Inspiration
With the whole world forced online, we wanted to help people connect by creating a simple, easy-to-use PC-building website. Many existing PC-building websites use terms that beginners cannot understand, and they either make the user select all of their parts themselves or offer a single randomly generated build based on very few factors.
What it does
Our Auto PC builder automatically compiles a list of viable components in real time based on a user's budget. This is more than a simple hardcoded program: it uses an API connected to a sophisticated scraper that gathers data on almost every component released since 2005 via PCPartPicker, from as many regions as the user desires. We completed an overhaul of the web-scraping system the API accessed in order to get more relevant data for our project. Using this, we can pull the live data of thousands of components at once and decide which component best suits the user's needs!
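The core idea above (filter live component data against a user's budget) can be sketched roughly like this. Everything here is illustrative: the category names, the budget ratios, and the "price tracks performance" heuristic are assumptions for the sketch, not the project's actual code.

```python
# Share of the total budget allotted to each component category
# (assumed ratios for illustration only).
BUDGET_SPLIT = {"cpu": 0.25, "gpu": 0.40, "ram": 0.10, "storage": 0.10,
                "motherboard": 0.10, "psu": 0.05}

def pick_build(parts_by_category, budget):
    """Pick, for each category, the best part that fits its budget slice.

    parts_by_category maps a category name to a list of dicts like
    {"name": ..., "price": ...}, e.g. as returned by a scraper.
    """
    build = {}
    for category, share in BUDGET_SPLIT.items():
        limit = budget * share
        affordable = [p for p in parts_by_category.get(category, [])
                      if p["price"] <= limit]
        if affordable:
            # Crude heuristic: assume price roughly tracks performance
            # within a category, so take the priciest part that fits.
            build[category] = max(affordable, key=lambda p: p["price"])
    return build
```

In the real project the part lists would come from the scraper's live data rather than being hardcoded, and the selection logic would weigh more factors than price alone.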
How we built it
The entire backend was written in Python 3.10 using a vastly modified and improved version of https://github.com/JonathanVusich/pcpartpicker.
(See Imports At Bottom)
The project prototype was first drafted on Figma and then coded using HTML and CSS to create a basic draft website.
Challenges we ran into
Amy: This was my first time working with Figma which was quite challenging in itself. One of the main challenges I ran into while using Figma was trying to make animations and interactions between specific objects for prototyping.
David: As a relatively new coder, I found it very challenging to learn multiprocessing, caching, lxml scraping, and Selenium/chromedriver all for the first time while also navigating around PCPartPicker's DDoS protection. On top of this, the code I based my scraper on targeted a very outdated Python version, so much of it needed to be rewritten. A demon that I ran into was the unfortunate case where my chromedriver package, which enabled the Cloudflare bypass, itself used multiprocessing, which meant I couldn't scrape with more than one instance at once! This was because daemonic processes cannot spawn children, so I had to find a way to implement a pool that was non-daemonic. A little scuffed, but I eventually got it working.
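The non-daemonic pool workaround described above follows a well-known pattern: subclass Process so it always reports itself as non-daemonic, then point a Pool subclass at it. This is a sketch of that general pattern (for Python 3.8+, where Pool.Process is a static method taking a context), not the project's exact code.

```python
import multiprocessing
import multiprocessing.pool

class NoDaemonProcess(multiprocessing.Process):
    """A Process that refuses to be daemonized, so it may spawn children."""

    @property
    def daemon(self):
        return False  # always report non-daemonic

    @daemon.setter
    def daemon(self, value):
        pass  # silently ignore Pool's attempt to set daemon=True

class NonDaemonicPool(multiprocessing.pool.Pool):
    """A Pool whose workers are allowed to spawn their own subprocesses."""

    @staticmethod
    def Process(ctx, *args, **kwargs):
        # Ignore the context's Process class and use ours instead.
        return NoDaemonProcess(*args, **kwargs)
```

With this, each pool worker can start its own chromedriver subprocess, which the standard daemonic workers are forbidden from doing.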
Mercy: I worked on the front end and wanted to use JS to manipulate JSON files and load them into a database.
Jagrit: Dealing with the way the API interfaced with the dataset was difficult, as it took a lot of in-depth understanding of an API that we had never worked with before. We spent a whole day modifying the API and successfully implemented it.
Accomplishments that we're proud of
David: This was my first time writing up a database (however shoddy) and creating such a large scraping program, and I'm proud that it functions as intended. Definitely had a huge boost to my knowledge in the last 48 hours!
Mercy: Coded a nice front end on a time crunch.
What we learned
Amy: I learnt a lot about creating a project prototype in Figma during the hackathon. I also helped Mercy with the front-end portion of our website, which challenged me to learn HTML and CSS.
David: I ended up learning a lot about Python multiprocessing, since without it the scraping took too long (over 30 minutes)! Some minor accomplishments were learning Selenium and lxml scraping while also practicing good coding habits. One regret is that I didn't comment my code because I was too busy writing it. (But it's okay, no one will see it anyways.)
Mercy: It's always good to practice web dev; there's always something new to learn. I had to adapt due to the time crunch, but it ended up okay.
Jagrit: One issue in the backend was how slow the web scraping was. To speed it up, we had to use multiprocessing; however, the standard Pool class caused problems because its daemonic workers can't spawn child processes. So, we created a whole new class that mimicked Pool's functionality.
What's next for QHACKS2022
If we had more time to work on the project during the hackathon, we would definitely want to finish turning the prototype into a full website. Another possible addition is a user database so that users can create an account and save their PC builds. It would also be interesting to build a community on the website where people can share their builds and exchange helpful tips and tricks.
Built With
- css
- css-bootstrap
- figma
- html
- javascript
- modified-pc-partpicker-api
- python


