Inspiration

Online harassment affects women and creates unsafe digital spaces. We wanted to build a proactive tool that protects users in real time, helping them feel safer, more comfortable, and more in control while browsing online.

What it does

SafeSpace is a browser extension that uses artificial intelligence to detect and filter toxic, abusive, and threatening content in real time. It automatically blurs harmful text, shows warnings, and allows users to reveal content only if they choose. Users can customize sensitivity levels and manage protection settings for a safer browsing experience.
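The blur-and-reveal behavior described above can be sketched in a few lines of content-script JavaScript. This is a minimal illustration, not the actual extension code: the helper names (`shouldBlur`, `blurElement`) and the sensitivity thresholds are hypothetical stand-ins for the customizable settings mentioned above.

```javascript
// Illustrative thresholds only: each user-facing sensitivity level maps
// to a toxicity score above which content gets blurred.
const THRESHOLDS = { low: 0.9, medium: 0.7, high: 0.5 };

// Decide whether a piece of text should be hidden, given its toxicity
// score (0..1) and the user's chosen sensitivity level.
function shouldBlur(toxicityScore, sensitivity = "medium") {
  const threshold = THRESHOLDS[sensitivity] ?? THRESHOLDS.medium;
  return toxicityScore >= threshold;
}

// In a content script, a flagged element can be blurred with CSS and
// revealed only when the user explicitly clicks it.
function blurElement(el) {
  el.style.filter = "blur(6px)";
  el.title = "Content hidden by SafeSpace - click to reveal";
  el.addEventListener("click", () => { el.style.filter = "none"; }, { once: true });
}
```

Keeping the decision logic (`shouldBlur`) separate from the DOM effect (`blurElement`) makes the threshold logic easy to test outside the browser.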

How we built it

We built SafeSpace as a Chrome extension using HTML, Tailwind CSS, and JavaScript. We integrated Google's Perspective API for real-time toxicity detection and used dynamic content scanning to analyze comments, posts, and messages as they load. To keep the page responsive, we optimized performance with debouncing and request batching.
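The debounce-and-batch pipeline can be sketched as follows. This is a simplified illustration under assumptions: `API_KEY` is a placeholder, the queue/flush structure is hypothetical, and the request body follows the Perspective API's `comments:analyze` format (one comment per request; the batch coalesces bursts of DOM mutations into a single flush rather than a single HTTP call).

```javascript
const API_KEY = "YOUR_API_KEY"; // placeholder
const API_URL =
  "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=" + API_KEY;

// Collapse a burst of calls into one invocation after `waitMs` of quiet.
function debounce(fn, waitMs) {
  let timer;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

// Newly discovered text nodes are queued; a debounced flush scores them.
const queue = [];
const flush = debounce(async () => {
  const batch = queue.splice(0, queue.length);
  for (const { text, onScore } of batch) {
    const res = await fetch(API_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        comment: { text },
        requestedAttributes: { TOXICITY: {} },
      }),
    });
    const data = await res.json();
    // summaryScore.value is the 0..1 toxicity probability.
    onScore(data.attributeScores.TOXICITY.summaryScore.value);
  }
}, 300);

function enqueue(text, onScore) {
  queue.push({ text, onScore });
  flush();
}
```

Debouncing means a page that injects dozens of comments at once triggers a single scan pass instead of one per DOM mutation.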

What we learned

We gained experience in browser extension development, AI integration, real-time content processing, and performance optimization, as well as a deeper understanding of building ethical, user-focused AI solutions.

What's next for SafeSpace

We plan to expand to more browsers, improve AI accuracy, add multi-language support, and introduce features like evidence capture and safe-reply assistance.

Built With

HTML, Tailwind CSS, JavaScript, Google Perspective API