Project Summary

MyDetective is an AI-enhanced interrogation assistant built at natHacks 2023 for the nuanced demands of security interviews. At its core, the system combines several AI technologies for real-time analysis and dynamic interaction during an interview. The backend, written in Python, uses the Brainflow library to interface with OpenBCI hardware, capturing EEG data that a random forest model then classifies for veracity. OpenAI's GPT-4 drives the interrogation itself, generating context-sensitive questions informed by the interviewee's background, so that each question is crafted to elicit a meaningful response.

The frontend is a responsive React.js web interface that presents the emotional and EEG analyses in an accessible format for interviewers, with Socket.IO handling real-time communication between client and server. The application is deployed on Amazon EC2 for reliable, scalable hosting.

The codebase is modular, so each component (facial expression analysis with the DeepFace API, speech emotion classification, and EEG-based lie detection) can evolve independently and be updated as new AI techniques emerge. Together, these technologies form a multi-dimensional analysis framework intended to improve the accuracy and psychological insight of security interviews.
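As a rough sketch of the EEG veracity step described above: once per-channel features have been extracted from the Brainflow/OpenBCI stream, a random forest can score each answer window. The feature layout (8 channels x 5 frequency bands), the synthetic data, and the truthful/deceptive labels below are purely illustrative, not the project's actual dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training set: each row is one answer window, with
# per-channel band-power features extracted from the EEG stream
# (8 channels x 5 frequency bands = 40 features). Real features
# would come from the Brainflow board data, not random numbers.
X = rng.normal(size=(200, 40))
y = rng.integers(0, 2, size=200)  # 0 = truthful, 1 = deceptive (illustrative labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Score a new answer window: estimated probability the response is deceptive.
proba = clf.predict_proba(X_test[:1])[0, 1]
print(f"P(deceptive) = {proba:.2f}")
```

In a live session, each window of board data would be featurized and passed through `predict_proba`, with the score streamed to the interviewer's dashboard over Socket.IO.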

Learning Experience

Embarking on this project, we delved deep into the realms of artificial intelligence, emotion recognition, and speech analysis. We learned to integrate various APIs such as DeepFace, Google's speech recognition, and OpenAI's GPT-4, each serving a unique purpose in our application. The complexity of emotional intelligence in AI and the ethical considerations of lie detection were areas that significantly broadened our understanding and perspective.

Building MyDetective

The construction of MyDetective was a journey of merging diverse technologies. We started by implementing facial expression analysis to interpret emotions during interviews. Integrating Google's speech recognition with the MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli zero-shot model let us classify transcribed speech into emotions effectively. The most challenging part was analyzing EEG data for lie detection, which required training a random forest model on the recorded signals. The centerpiece of the project was the dynamic use of GPT-4 to generate context-aware questions, making the interrogation process adaptive and intelligent.
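The GPT-4 step above can be sketched as a prompt-construction helper: the interviewee's background and the signals gathered so far are folded into the chat messages before asking the model for the next question. The helper name, field names, and system prompt below are our own illustration, not the project's exact code.

```python
def build_question_messages(background, last_answer, emotion, deception_score):
    """Assemble chat messages asking GPT-4 for the next context-aware question.

    All names here are illustrative; a real deployment would pass the
    returned list to the OpenAI chat completions API (model="gpt-4").
    """
    system = (
        "You are an interrogation assistant. Given the interviewee's "
        "background and live emotion/EEG readings, propose one concise "
        "follow-up question designed to elicit a meaningful response."
    )
    context = (
        f"Background: {background}\n"
        f"Last answer: {last_answer}\n"
        f"Detected emotion: {emotion}\n"
        f"EEG deception score (0-1): {deception_score:.2f}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": context},
    ]

messages = build_question_messages(
    background="Warehouse employee, on shift during the incident",
    last_answer="I left the building before anything happened.",
    emotion="fear",
    deception_score=0.72,
)
print(messages[1]["content"])
```

Feeding fused signals back into the prompt at each turn is what makes the question flow adaptive rather than scripted.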

Challenges we ran into

Our path was not without obstacles. One of the major challenges was getting the different technologies to integrate seamlessly and work in harmony. Analyzing EEG data accurately while respecting ethical boundaries was a complex task. Balancing the technical side with the human side of interrogations, ensuring our tool aids rather than dominates the human element, was something we continuously worked on.

What's next for MyDetective

Looking ahead, we are excited to take MyDetective to the next level. One of our primary goals is to enhance the emotional analysis component by incorporating voice tone analysis. By analyzing the nuances in an interviewee's tone, we aim to gain deeper insights into their emotional state, adding another layer to our comprehensive interrogation tool. We believe that this advancement will not only improve the accuracy of emotion detection but also enrich the overall interrogation process, making it even more adaptive and intuitive.

Built With

- Amazon EC2
- Brainflow
- DeepFace
- OpenAI GPT-4
- Python
- React.js
- Socket.IO
