Inspiration

Toxic chat messages are pervasive, especially in online gaming communities. Knowing the devastating psychological impact of cyberbullying, we wanted to explore how we could prevent people from unwillingly seeing hurtful messages.

What it does

We deployed a Discord (chat app) bot that reads messages from users in a chat and feeds them to a deep-learning backend, which uses natural language processing to predict whether a message is toxic or bullying in nature. If it is, the bot erases the message from the chat history almost instantaneously and notifies the moderator. Messages flagged as toxic are also logged for later review by the moderator.
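The delete-and-log decision can be sketched independently of the Discord API. This is a minimal illustration, not our deployed code: `predict_toxicity` is a hypothetical stand-in for the neural network backend, and the threshold value is illustrative.

```python
from datetime import datetime, timezone

TOXICITY_THRESHOLD = 0.5  # illustrative cutoff, not our tuned value


def predict_toxicity(text: str) -> float:
    """Stand-in for the model backend: returns P(toxic) in [0, 1].

    The real system runs a neural network here; this keyword check is
    only a placeholder so the sketch is runnable.
    """
    flagged_words = {"idiot", "loser"}
    return 1.0 if any(w in text.lower() for w in flagged_words) else 0.0


def moderate(message: str, author: str, log: list) -> bool:
    """Decide whether one chat message should be erased.

    Toxic messages are appended to `log` so moderators can review them later;
    the return value tells the bot whether to delete the message.
    """
    if predict_toxicity(message) >= TOXICITY_THRESHOLD:
        log.append({
            "author": author,
            "message": message,
            "time": datetime.now(timezone.utc).isoformat(),
        })
        return True
    return False
```

In the live bot, a `True` result triggers the Discord message-deletion call and a notification to the moderator.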

How we built it

We used the Discord API to build a bot that scrapes text messages. We then built a neural network classifier on datasets of toxic sentiment and sarcasm to label messages as toxic or not. Finally, we combined the two into a single pipeline: the bot forwards each incoming message to the classifier and acts on its prediction.
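The classifier side can be sketched as a text-vectorizer feeding a small neural network. This uses scikit-learn's `TfidfVectorizer` and `MLPClassifier` as stand-ins for our actual architecture and training data; the corpus below is a tiny illustrative toy, not the toxic-comment and sarcasm datasets we trained on.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus; the real model was trained on public
# toxic-sentiment and sarcasm datasets.
texts = [
    "you are terrible at this game", "nobody wants you here",
    "go away you loser", "everyone hates playing with you",
    "nice shot, well played", "good game everyone",
    "thanks for the help", "see you next round",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]  # 1 = toxic, 0 = clean

# Character n-gram features plus a small multilayer perceptron,
# standing in for our deep-learning backend.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
model.fit(texts, labels)
```

Once fitted, `model.predict(["some message"])` yields a 0/1 toxicity label that the bot can act on.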

Challenges we ran into

We had to adapt our models and architecture based on our preliminary results. For example, the "Sarcasm" feature of the model emerged from trial and error with our original, simpler model. Additionally, we had to constantly tune hyperparameters (architecture, learning rate, depth, batch size, etc.) for the model to reach the level of accuracy we reported.
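The tuning loop amounts to scoring every combination in a search space and keeping the best. This is a hedged sketch: the space below is illustrative (not our exact settings), and `evaluate` is a placeholder for a full train/validate run.

```python
from itertools import product

# Hypothetical search space mirroring the knobs we tuned;
# the values here are illustrative, not our final settings.
SEARCH_SPACE = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "hidden_layers": [1, 2, 3],
    "batch_size": [16, 32, 64],
}


def tune(evaluate, space=SEARCH_SPACE):
    """Score every hyperparameter combination and return the best.

    `evaluate` maps a dict of hyperparameters to a validation metric
    (higher is better); in practice it wraps an entire training run.
    """
    keys = list(space)
    best_params, best_score = None, float("-inf")
    for values in product(*(space[k] for k in keys)):
        params = dict(zip(keys, values))
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Grid search is the simplest strategy; random or Bayesian search can replace the `product` loop without changing the interface.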

Accomplishments that we're proud of

We designed a neural network model and tested it against real-world data with a "research-oriented" mentality, which led us to a new feature for our final product. We are also proud of how we worked through results that were not "expected," which felt somewhat novel, particularly for machine learning prediction. Never before had we completely rethought a model design based on our own research.

What we learned

We discovered that tuning models beyond just hyperparameters is key to finding large jumps in performance. Ultimately, this encouraged us to keep thinking about the "larger picture": not being afraid to rethink our preconceptions about the data unlocks a whole world of modeling possibilities.

What's next for Detox

We wish to integrate Detox with other public APIs, such as Facebook's. Ultimately, bringing an idea like this to video games and social media platforms could be essential to curbing the effects of cyberbullying.
