Note: Please ignore errors in the .ipynb -- they stem from runtime connectivity issues when sharing. If needed, I will supply a demo.

Inspiration

Watching the film "The Wild Robot," in which the main robot learns to comprehend emotions, speak the language of animals, and help raise a goose, inspired me to create an AI app that, like the wild robot, can detect emotions.

What it does

Empathis is an AI project focused on facial keypoint detection and analysis; using the detected keypoints, it can recognize emotions from photos of a person.

How we built it

  1. Pandas and NumPy for data manipulation
  2. TensorFlow and Keras for deep learning model development
  3. OpenCV (cv2) for image processing
  4. Matplotlib and Seaborn for data visualization

A slideshow explaining the process and calculations is in the Empathis AI.ipynb ** link ** -- For Slideshow
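The pandas/NumPy portion of the pipeline can be sketched as below. This is a minimal illustration, assuming the common Kaggle "Facial Keypoints Detection" CSV layout (keypoint coordinate columns plus an `Image` column holding a 96x96 grayscale frame as a space-separated pixel string); the column names, image size, and function name are illustrative assumptions, not taken from the Empathis notebook itself.

```python
import numpy as np
import pandas as pd

IMG_SIZE = 96  # assumed image side length for the dataset


def load_keypoint_data(df: pd.DataFrame):
    """Convert a keypoint DataFrame into model-ready arrays.

    Assumes an 'Image' column of space-separated grayscale pixel
    strings and numeric keypoint columns for everything else.
    """
    # Parse each pixel string into a normalized 96x96x1 float array.
    images = np.stack([
        np.asarray(s.split(), dtype=np.float32).reshape(IMG_SIZE, IMG_SIZE, 1) / 255.0
        for s in df["Image"]
    ])
    # Keypoint targets: every non-image column, scaled to roughly [-1, 1].
    keypoints = df.drop(columns=["Image"]).to_numpy(dtype=np.float32)
    keypoints = (keypoints - IMG_SIZE / 2) / (IMG_SIZE / 2)
    return images, keypoints


# Tiny synthetic example standing in for the real CSV.
demo = pd.DataFrame({
    "left_eye_x": [30.0],
    "left_eye_y": [40.0],
    "Image": [" ".join(["128"] * (IMG_SIZE * IMG_SIZE))],
})
X, y = load_keypoint_data(demo)
```

Scaling keypoints to a symmetric range around zero is a common choice for regression heads with mean-squared-error loss; the exact normalization used in the notebook may differ.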

Challenges we ran into

  1. Accurately detecting facial keypoints across diverse images with varying lighting, angles, and facial expressions
  2. Handling missing or incomplete data in the dataset
  3. Designing and shaping the residual network and choosing its metrics
  4. Optimizing model performance and accuracy while maintaining computational efficiency
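The residual-network challenge above can be illustrated with a minimal Keras sketch of a small residual CNN for 96x96 keypoint regression. The filter counts, block depth, and 30-value output (15 (x, y) keypoints) are illustrative assumptions, not the exact Empathis architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers


def residual_block(x, filters):
    """Two 3x3 convs with a skip connection (the 'residual' path)."""
    shortcut = x
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    if shortcut.shape[-1] != filters:
        # Match channel counts with a 1x1 conv so the add is valid --
        # this shape bookkeeping is exactly where residual nets get tricky.
        shortcut = layers.Conv2D(filters, 1, padding="same")(shortcut)
    return layers.ReLU()(layers.Add()([y, shortcut]))


def build_model(img_size=96, n_outputs=30):
    inputs = layers.Input((img_size, img_size, 1))
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    for filters in (32, 64):
        x = residual_block(x, filters)
        x = layers.MaxPooling2D()(x)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(n_outputs)(x)  # regressed keypoint coordinates
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model


model = build_model()
```

Mean absolute error is one reasonable metric choice for coordinate regression; the notebook's actual metrics may differ.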

Accomplishments that we're proud of

I successfully implemented a facial keypoint detection system using advanced deep-learning techniques, and its high accuracy rate makes me even happier. Images of successful runs are provided below.

What we learned

  1. Working with image data and computer vision techniques
  2. Implementing and fine-tuning deep learning models for facial analysis
  3. Data preprocessing and handling large datasets
  4. Collaborative development using Jupyter notebooks and version control

What's next for Empathis

  1. Developing a user-friendly interface or API for easy integration into other applications
  2. Exploring real-time facial keypoint detection for video streams
  3. Adding a user-friendly UI for interaction
