Inspiration
One of our friends is highly sensitive to online content containing gore, violence, or spiders. Seeing this content unexpectedly disgusts and scares them; one time, they threw their laptop across the room when a close-up image of a tarantula appeared. So we wanted to build an application that would automatically filter not only gory content, but also content tied to any phobia, or anything else the user may not want to see.
What it does
SafeBrowse uses generative AI to identify and blur parts of any webpage that are likely to distress a user, based on their phobias, dislikes, and other preferences. It filters both sensitive text and images, blurring every relevant part of the page before rendering it, and lets the user unblur individual sections or add trusted websites to a whitelist.
How we built it
We built SafeBrowse using Next.js and TailwindCSS on the frontend, along with the Chrome Extensions API. When a webpage loads, a content script is injected into it; if the page needs to be filtered, the script covers the content with a loading screen while filtering runs. The content script sends each relevant element to a Python Flask API, which uses GPT-4o mini to decide whether that element should be filtered according to the user's settings in the extension.
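As a rough illustration of the per-element classification step, here is a minimal Python sketch. The prompt wording, function names, and the `call_model` stub are our assumptions for illustration, not the project's actual code; in the real service the prompt would be forwarded to GPT-4o mini.

```python
# Hypothetical sketch of the per-element filtering decision.
def build_prompt(element_text, sensitivities):
    """Ask for a strict YES/NO on whether the text matches a sensitivity."""
    topics = ", ".join(sensitivities)
    return (
        f"You are a content filter. The user is sensitive to: {topics}.\n"
        "Answer exactly YES if the following text relates to any of those "
        "topics, otherwise answer exactly NO.\n\n"
        f"Text: {element_text}"
    )

def call_model(prompt):
    # Placeholder: the real backend would send the prompt to GPT-4o mini
    # via the OpenAI API and return the model's reply.
    return "NO"

def should_blur(element_text, sensitivities):
    """Treat anything other than an unambiguous YES as 'do not blur'."""
    reply = call_model(build_prompt(element_text, sensitivities))
    return reply.strip().upper().startswith("YES")
```

Defaulting to "do not blur" on any ambiguous reply keeps a flaky model response from blanking out an entire page.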
Challenges we ran into
We had difficulty deciding how to segment the text that needed to be filtered: if a section was too large, we would blur more content than necessary, and if it was too small, we risked not blurring enough. We had to balance thorough filtering against obscuring too much of the web page.
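The balance described above can be sketched as a simple chunking pass. The character thresholds below are illustrative assumptions, not the values the project actually used.

```python
import re

MIN_CHARS = 80   # below this, a chunk carries too little context to classify
MAX_CHARS = 600  # above this, one bad sentence blurs a whole section

def chunk_text(text):
    """Split on sentence ends, then merge sentences until each chunk
    sits comfortably between the two bounds."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    chunks, current = [], ""
    for sentence in sentences:
        # Flush before the merge would push the chunk past the cap.
        if current and len(current) + len(sentence) + 1 > MAX_CHARS:
            chunks.append(current)
            current = sentence
        else:
            current = f"{current} {sentence}".strip()
    if current:
        chunks.append(current)
    # Fold a too-small trailing chunk into its neighbour for context.
    if len(chunks) > 1 and len(chunks[-1]) < MIN_CHARS:
        chunks[-2] = f"{chunks[-2]} {chunks[-1]}"
        chunks.pop()
    return chunks
```

Chunks near the upper bound give the classifier enough context without forcing a blur over an entire article when only one sentence is sensitive.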
Accomplishments that we're proud of
This was the first hackathon for two of us and the first time we had worked together on a Chrome extension, so we're happy with how efficiently we collaborated and troubleshot problems.
What we learned
We learned a lot about the runtimes of various machine learning models and how runtime is affected by factors like input size and the number of tokens. We did a lot of prompt and query optimization to reduce the runtime significantly.
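One optimization of the kind described above is batching: sending many elements in a single numbered prompt so the model is called once per batch instead of once per element. This is a hypothetical sketch; the prompt format and reply parsing are our assumptions.

```python
# Hypothetical batching sketch: one model call classifies many elements.
def build_batch_prompt(elements, sensitivities):
    """Number each element so the reply can be matched back to it."""
    topics = ", ".join(sensitivities)
    numbered = "\n".join(f"{i}: {text}" for i, text in enumerate(elements))
    return (
        f"The user is sensitive to: {topics}. For each numbered text, "
        "reply with its number, a colon, and YES or NO.\n" + numbered
    )

def parse_batch_reply(reply, count):
    """Default every element to 'do not blur'; flip only explicit YES lines."""
    verdicts = [False] * count
    for line in reply.splitlines():
        head, _, tail = line.partition(":")
        if head.strip().isdigit() and tail.strip().upper().startswith("YES"):
            index = int(head)
            if index < count:
                verdicts[index] = True
    return verdicts
```

Batching amortizes the fixed per-request overhead, which dominated our latency when pages contained dozens of small elements.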
What's next for SafeBrowse
We hope to continue developing the application and publish it on the Chrome Web Store, so it can help anyone who is sensitive to particular kinds of content.