Inspiration

Amazon recently announced that it would discontinue an AI recruiting tool that had shown bias against women. It is convenient to blame bias on algorithms alone, so we wanted to pursue a project that would uncover other sources of bias in the hiring process. We created resumesai to help both resume writers and resume readers reflect on the biases around them.

What it does

We collected 25 anonymous resumes from Boilermake participants and created a data collection portal where recruiters, hiring managers, and engineers could indicate whether they thought each student would make a good software engineer. Six sponsors participated in our data collection.

We also captured data for 75 features by manually extracting information from the resumes. We used machine learning techniques (decision trees and association rule mining) to model the bias that these reviewers may have exhibited in their decisions. These models, in addition to context about bias already present in the resumes and comparisons to other reviewers, can help people reflect on their own process.
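As a minimal sketch of the decision-tree side of this modeling, the toy data below stands in for our real 75-feature dataset: the feature names, records, and verdicts are illustrative placeholders, not the actual resumes or reviewer decisions, and we assume scikit-learn is available.

```python
# Illustrative only: features and verdicts are made up, not the real dataset.
from sklearn.tree import DecisionTreeClassifier

# Binary resume features (1 = present on the resume).
# Columns: has_internship, lists_gpa, attended_hackathon
X = [
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 0],
    [1, 1, 1],
    [0, 1, 1],
]
# Reviewer verdicts (1 = "would make a good software engineer").
y = [1, 1, 0, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(X, y)

# Feature importances hint at which signals drove the reviewer's
# decisions; a heavy weight on a proxy feature can surface bias.
for name, importance in zip(
    ["has_internship", "lists_gpa", "attended_hackathon"],
    tree.feature_importances_,
):
    print(f"{name}: {importance:.2f}")
```

In this toy data the verdict tracks `has_internship` exactly, so the tree assigns that feature all of the importance, which is the kind of pattern a reviewer might want to reflect on.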

How we built it

We followed this data science project methodology:

  1. Read anonymous resumes and brainstorm features.
  2. Get feedback from domain experts.
  3. Manually retrieve data from resumes.
  4. Collect training data from users.
  5. Describe dataset and select model features.
  6. Model user decisions.
  7. Get feedback from users.
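Step 6 can be sketched with an association-rule style measure: the confidence of a rule such as "has an internship → rated hireable". The records and feature names below are hypothetical placeholders, not our actual data.

```python
# Illustrative only: records are made up, not the collected reviewer data.
# Each record is the set of resume features plus the reviewer's verdict.
records = [
    {"has_internship", "lists_gpa", "hired"},
    {"has_internship", "attended_hackathon", "hired"},
    {"lists_gpa"},
    {"attended_hackathon"},
    {"has_internship", "lists_gpa", "attended_hackathon", "hired"},
]

def confidence(antecedent, consequent, data):
    """Estimate P(consequent | antecedent) over the records."""
    matching = [r for r in data if antecedent <= r]
    if not matching:
        return 0.0
    return sum(consequent <= r for r in matching) / len(matching)

# A rule like {has_internship} -> {hired} with high confidence suggests
# the feature strongly influenced reviewers.
print(confidence({"has_internship"}, {"hired"}, records))  # 1.0
```

High-confidence rules like this are what we surfaced back to reviewers so they could ask whether a feature deserves that much weight.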

Challenges we ran into

  * Choosing appropriate metrics
  * Going through all the stages of a data science project
  * Manually extracting data from anonymous resumes

Accomplishments that we're proud of

What we learned

What's next for resumesai
