Inspiration
The inspiration comes from the lack of mental health assistants and the increase in working from home. There are many mental health guides on the web, but smart assistants lack features that focus on mental well-being on a daily basis. Working from home has increased drastically in the past year; many have found it rewarding, while others have struggled. This project aims to tackle the mental well-being challenges of working from home with a personalized assistant that emphasizes well-being.
What it does
This project lets the user track To Dos and Goals using lists. It also tracks the user's mental well-being through a well-being assessment, which lets the assistant gauge the user's current state and provide actionable recommendations. The app can be used in a web browser or installed on a computer.
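The assessment-to-recommendation flow could be sketched as follows. This is a minimal illustration only: the scoring scheme, thresholds, and recommendation text are assumptions for the sketch, not the app's actual logic.

```python
# Hypothetical sketch of a well-being assessment feeding recommendations.
# The scoring scale (1-5), thresholds, and messages are illustrative assumptions.

def assess_wellbeing(answers):
    """Average 1-5 assessment answers into a single well-being score."""
    return sum(answers) / len(answers)

def recommend(score):
    """Map a well-being score to an actionable recommendation."""
    if score < 2.5:
        return "Take a break and try a short guided breathing exercise."
    if score < 4.0:
        return "Schedule a walk between your next two To Dos."
    return "You're doing well; keep your current routine."

answers = [4, 3, 5, 3]  # e.g. responses to four assessment questions
print(recommend(assess_wellbeing(answers)))
```

A real assessment would weight questions differently and track scores over time, but the idea is the same: reduce the answers to a state the assistant can act on.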
How we built it
We built the app with React on the front end and Node.js and Firebase on the backend. We also used Python for the desktop version.
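Since the web and desktop versions share the same Node.js/Firebase backend, the Python desktop client presumably talks to it over HTTP. A minimal standard-library sketch of what that could look like; the base URL, route, and JSON payload shape are assumptions, not the project's actual API:

```python
import json
import urllib.request

# Hypothetical endpoint; the real API route and payload shape are assumptions.
API_BASE = "https://example.com/api"

def build_todo_request(title, done=False):
    """Build a POST request that would create a To Do on the shared backend."""
    payload = json.dumps({"title": title, "done": done}).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/todos",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_todo_request("Finish well-being assessment")
# urllib.request.urlopen(req) would actually send it; omitted to keep the sketch offline.
```

Routing both clients through one API keeps the To Do and Goal data in a single Firebase store, so the web and desktop versions always see the same state.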
Challenges we ran into
The main challenge we ran into was working with open source. We wanted to integrate facial emotion recognition, but building and training an emotion model in 24 hours proved too difficult, and the open-source solutions we found were too large to integrate without violating hackathon rules, so we scrapped the feature. The second challenge was the time difference: one team member is temporarily located in Slovakia, which made time management difficult.
Accomplishments that we're proud of
Our main accomplishment is delivering working software despite being located in different countries with a 7-hour time difference. We used our diverse skills to build robust web and desktop versions.
What we learned
We both learned how to use APIs to communicate between platforms. One team member learned how to build a Node.js backend, connect it to Firebase, and query the database. We also learned Firebase authentication from both the Node.js and Python sides, and explored facial emotion recognition using OpenCV and existing open-source projects.
What's next for Well-being Buddy
The feature we are most interested in building is facial emotion detection, which would let the assistant give more precise, automated well-being recommendations. We would like to train and create our own emotion detection model, and also add an object detection model that recognizes what the user is doing to give even more tailored recommendations. Some of these features have been prototyped, but they rely on existing open-source projects. We would also like to build a mobile version so the assistant can travel with the user.