Inspiration
During and after the COVID-19 pandemic, the need for online communication skyrocketed, and with it came virtual meetings. However, we realized that it is particularly hard to recognize the emotions of friends or peers on a call, as the experience just isn't the same as it is in person. This is an even greater challenge for people who have trouble recognizing emotions, such as those with autism, since online communication can make emotions harder to read. Hence, we decided to develop a Google Chrome extension that recognizes the emotions of the people you're calling, making communication clearer and more effective.
What it does
Emoji Meet is a Google Chrome extension that uses computer vision to detect the emotion of the other person in a Google Meet call. Their emotion is displayed as a corresponding emoji in a small popup, which updates automatically as the person's expression changes.
How we built it
Emoji Meet uses the Google Cloud Vision API to determine which emotion the expression of the person in the video corresponds to. The result is stored in a MongoDB database and sent to the React front end, which uses the emotion it gets from the database to display an appropriate emoji on the screen. The back end was built with Flask, which handles the HTTP requests between the front end, the database, and the Cloud Vision API.
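To give a rough idea of the mapping step: the Vision API's face-detection response reports per-emotion likelihoods (`joyLikelihood`, `sorrowLikelihood`, `angerLikelihood`, `surpriseLikelihood`) as enum names like `VERY_LIKELY`. A sketch of how such an annotation could be turned into an emoji for the popup — the `pick_emoji` helper, its scoring, and the emoji choices are our own illustration, not the project's exact code:

```python
# Maps Vision API likelihood enum names to a comparable score.
# The enum names match the real Cloud Vision API; the scores are ours.
LIKELIHOOD_SCORE = {
    "VERY_UNLIKELY": 0, "UNLIKELY": 1, "POSSIBLE": 2,
    "LIKELY": 3, "VERY_LIKELY": 4, "UNKNOWN": 0,
}

# Illustrative emotion-to-emoji mapping.
EMOTION_EMOJI = {
    "joy": "😄", "sorrow": "😢", "anger": "😠", "surprise": "😲",
}

def pick_emoji(face):
    """face: dict of {emotion: likelihood string}, e.g. the joy/sorrow/
    anger/surprise likelihoods pulled from a Vision API face annotation.
    Returns the emoji for the most likely emotion, or a neutral face."""
    best_emotion, best_score = None, 0
    for emotion, likelihood in face.items():
        score = LIKELIHOOD_SCORE.get(likelihood, 0)
        if score > best_score:
            best_emotion, best_score = emotion, score
    return EMOTION_EMOJI.get(best_emotion, "😐")
```

For example, `pick_emoji({"joy": "VERY_LIKELY", "sorrow": "VERY_UNLIKELY", "anger": "VERY_UNLIKELY", "surprise": "POSSIBLE"})` returns "😄".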
Challenges we ran into
One of the major challenges we ran into was parsing the image data received from the front end and formatting it so the Cloud Vision API could use it. The data arrives base64-encoded, so we decode it into bytes and write it to a .jpg file. Figuring out this approach took a lot of time and frustration.
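The decoding step described above could look roughly like this. We are assuming the front end sends a browser-style data URL (`data:image/jpeg;base64,...`); the function name and the output path are illustrative, not the project's actual code:

```python
import base64

def data_url_to_jpg(data_url, path="frame.jpg"):
    """Strip the data-URL prefix the browser adds, decode the base64
    payload into raw bytes, and write them to a .jpg file that can then
    be handed to the Vision API client. Assumes a "data:...;base64,"
    prefix before the payload."""
    # "data:image/jpeg;base64,<payload>" -> "<payload>"
    _, _, payload = data_url.partition(",")
    image_bytes = base64.b64decode(payload)
    with open(path, "wb") as f:
        f.write(image_bytes)
    return image_bytes
```

The subtle part is that the base64 payload is only the portion after the comma; decoding the full data URL (prefix included) fails, which is easy to miss under hackathon time pressure.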
Accomplishments that we're proud of
For both of us, this is one of the first hackathons where our project is actually functional; at many previous hackathons we submitted incomplete or broken code. We're proud that we assessed our skill level, decided on an achievable project idea, and executed it well. We believe our primary goal was accomplished and that this idea can be readily used by anyone!
What we learned
Since one of us comes from a front-end background and the other from a back-end background, we were teaching and learning from each other throughout the entire hackathon. On the front end, we learned React Hooks such as useState and useEffect; on the back end, we learned how to integrate the Vision API and use MongoDB as a database.
What's next for Emoji Meet
Emoji Meet could gain more advanced features, such as displaying the emotions of multiple people in a Google Meet call (it currently supports only one person), or detecting other attributes of a person, such as whether they are wearing a hat. Since Emoji Meet is a Google Chrome extension, publishing it to the Chrome Web Store for the public to use is not far-fetched at all.

