Inspiration
With the increased attention on online platforms to racial and demographic disparities in violence, including police violence, our goal was to showcase how bias in AI may be affecting this issue. We found a study by the Department of Justice in which machine learning was used to forecast murder. A parallel can be drawn to a model that predicts victims rather than perpetrators: with historical victim demographics used as training data, it is clear how bias could drastically affect the model's output.
What it does
Our website gives users insight into how an AI might predict the likelihood of their becoming a victim of a police shooting or homicide based on their demographics. The site also includes multiple visualizations, which present graphs of exactly where they fall relative to the rest of our stored data.
How we built it
We first did exploratory data analysis on the datasets we gathered, from the Washington Post and the US Census. This allowed us to clean the data, after which we could calculate the probabilities of homicide based on gender, age, race, and location, as well as the likelihood of someone being affected by a police shooting (based on the same factors). We did this by creating a probability function that computed the conditional probability of an incident given a set of attributes. This logic was then ported over to a JavaScript/Svelte front-end to present the data and let users of the website see where they fall according to their demographics and our calculated probabilities.
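The core of this step can be sketched as follows. This is a minimal illustration, not the project's actual code: the column names, the toy incident rows, and the population counts are all hypothetical stand-ins for the cleaned Washington Post and Census data.

```python
import pandas as pd

# Hypothetical cleaned incident records (illustrative rows; the real
# data came from the Washington Post and US Census datasets).
incidents = pd.DataFrame({
    "age_group": ["18-29", "18-29", "30-44", "18-29", "45+"],
    "gender":    ["M", "M", "F", "M", "M"],
    "state":     ["TX", "TX", "CA", "TX", "CA"],
})

# Population counts per demographic group (would come from Census data).
population = pd.DataFrame({
    "age_group": ["18-29", "30-44", "45+"],
    "gender":    ["M", "F", "M"],
    "state":     ["TX", "CA", "CA"],
    "count":     [100_000, 120_000, 90_000],
})

def conditional_probability(attrs: dict) -> float:
    """P(incident | attributes): the number of incidents matching the
    given attributes divided by the population of the matching group."""
    inc, pop = incidents, population
    for col, val in attrs.items():
        inc = inc[inc[col] == val]
        pop = pop[pop[col] == val]
    n_pop = pop["count"].sum()
    return len(inc) / n_pop if n_pop else 0.0

p = conditional_probability({"age_group": "18-29", "gender": "M", "state": "TX"})
# 3 matching incidents out of a population of 100,000 -> 3e-05
```

In the real project this logic was reimplemented in JavaScript for the Svelte front-end, so the probabilities could be computed client-side from the user's selected demographics.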
Challenges we ran into
Initially, we were thinking of building a full-fledged machine learning model using PyTorch or TensorFlow, and we also tried a perceptron algorithm to find weights for the demographic variables. Eventually, we realized our datasets did not contain the data required for a full machine-learning approach, so we used Bayesian conditional probabilities instead and found the results to be accurate for our purposes.
Accomplishments that we're proud of
We are very proud of how we worked through the many challenges we faced and used conditional probability to let users understand how AI would predict their risk of becoming a victim of a homicide or a police shooting from their demographics, all on an easy-to-use, interactive website.
What we learned
We learned a lot about conditional probability and how biased demographic data can affect AI. If a machine learning model learned from this data and its outcomes, it would likely come out biased, because it would treat the correlations between race, gender, location, age, and the likelihood of dying by homicide or being affected by a police shooting as causation, when this is not always the case. With the focus on racial disparities, for example through the BLM movement, biased AI could degrade the quality of information on the matter circulating on online platforms.
What's next for the Precog Project
We would like to expand the data we analyze to encompass other topics, such as the likelihood of tragedies beyond police shootings and homicides.