Inspiration:
Mental health conditions have detrimental effects on a person's lifestyle and can distort their sense of self-identity. In a university setting especially, young adults face many sources of stress, such as academic pressure, career goal-setting, and lifestyle and life choices. This can lead to substance abuse, isolation and loneliness, and many other issues.
Colleges recognize this and have dramatically increased the resources they provide to students. Northwestern University, for example, has a stress-coping app called 'Breathe', and the University of Pennsylvania offers a program called 'I Care' that trains students and faculty to intervene with resources and education when someone is distressed. However, the stigma around discussing mental health problems, fear or shame about seeking help, or simply being unaware of the available on-campus and local resources results in a steep gap between the resources available to students and the resources students actually use. Above all, university students often feel overwhelmingly lonely on their journey through school, especially regarding mental health, and shy away from help or reaching out.
We aim to combat this with a web application that can provide two solutions:
- Provide an anonymous outlet where users foster a positive community through words of affirmation, offering people a safe space to feel validated by removing the stigma around sharing personal mental health experiences.
- Connect users with their local resources, and increase accessibility to information 24/7.
We believe a platform that bridges students' mental health with the local community and its resources is vital to our schools' efforts to improve mental healthcare. Our goal is to use technology to create a form of social media that reduces the escalating isolation between the younger generation and their community.
What it does:
Users create a profile that keeps them anonymous to other users. Users can create posts to share the mental health and management issues they are facing and to vent about their problems in a safe, positive space, without feeling embarrassed or ashamed. They can tag each post with any of a pre-existing list of tags (such as 'anxiety', 'depression', 'teststress', etc.). These tags are interactive: users can view and follow other profiles that post with a specific tag, and can view pages composed solely of posts containing that tag.
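The tag flow above can be sketched in plain JavaScript. This is an illustrative model only; the function and field names are assumptions, not the actual Vera code. Posts may only carry tags from the fixed list, and a "tag page" is simply every post carrying that tag.

```javascript
// Fixed tag list from which users choose (illustrative subset).
const ALLOWED_TAGS = ["anxiety", "depression", "teststress"];

// Create a post; unknown tags are silently dropped so only
// pre-existing tags ever appear on a post.
function createPost(authorId, body, tags) {
  const validTags = tags.filter((t) => ALLOWED_TAGS.includes(t));
  return { authorId, body, tags: validTags };
}

// Build a "tag page": all posts that carry the given tag.
function postsWithTag(posts, tag) {
  return posts.filter((p) => p.tags.includes(tag));
}
```

Keeping the tag vocabulary closed is what makes the tag pages reliable: every post on the 'teststress' page was deliberately labeled with that exact tag.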
Other users can validate anyone else's post with one of four pre-existing affirmations ('I can relate', 'This helps me', 'You inspire me', and 'Thanks for sharing'). These messages create a powerful sense of community and belonging, encourage people to share their feelings, and lower both perceived and actual stigma. Vera seeks to bring people together by showing them that, at their lowest moments, they are not alone. Users can also see how many people responded to their post with each of the affirmations.
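Because responses are limited to four fixed phrases, the per-post counts users see reduce to a simple tally. The sketch below is a hypothetical illustration of that logic (names are ours, not from the codebase):

```javascript
// The four fixed affirmation phrases.
const AFFIRMATIONS = [
  "I can relate",
  "This helps me",
  "You inspire me",
  "Thanks for sharing",
];

// Record one affirmation on a post's tally, a plain { phrase: count } map.
// Free-form responses are impossible: anything outside the list is rejected.
function affirm(tally, phrase) {
  if (!AFFIRMATIONS.includes(phrase)) {
    throw new Error("unknown affirmation: " + phrase);
  }
  tally[phrase] = (tally[phrase] || 0) + 1;
  return tally;
}
```

Rejecting anything outside the list is the point of the design: it keeps every possible interaction positive by construction.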
If users find a post that seems inappropriate, they can flag it, and the post is removed from their own newsfeed. With enough flags, moderators can remove the post altogether. Users also have access to information on local resources.
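A minimal sketch of that flagging flow, assuming a per-user flag set and an arbitrary review threshold (both the threshold value and the function names are our assumptions):

```javascript
// Flags before a post is queued for moderator review (assumed value).
const FLAG_THRESHOLD = 5;

// Flag a post: hide it from the flagger's own feed immediately, and
// report whether the post has accumulated enough flags for review.
// Using Sets means a user's repeated flags only count once.
function flagPost(post, userId) {
  post.flaggedBy = post.flaggedBy || new Set();
  post.hiddenFor = post.hiddenFor || new Set();
  post.flaggedBy.add(userId);
  post.hiddenFor.add(userId); // removed from this user's newsfeed
  return post.flaggedBy.size >= FLAG_THRESHOLD; // true => needs a moderator
}
```

The key property is that flagging is immediate for the individual (the post leaves their feed) but removal for everyone requires crossing the threshold and a moderator's decision.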
- *This is a tool for self-help and for connecting people to resources, not a crisis hotline. There is no chatting and no free-form responses to others' posts; only validation.*
#anonymity #selfhelpselfcare #community
The application can be used in 3 easy steps.
- Step 1: Create anonymous profile.
- Step 2: Draft a post and use pre-existing tags about issues specific to your situation.
- Step 3: Post; view posts about similar issues (through the pre-existing tags); validate others with the four phrases of affirmation; or view local resources and learn more about specific conditions.
How we built it:
The core functionality of our application is a platform where users can log on, create posts, and see each other's posts. It was developed with a React.js front end, an Express.js server with the Sequelize ORM for data storage and manipulation on the back end, and Okta with Express.js for user authentication. We also developed an interactive prototype in Adobe XD, part of the Creative Cloud suite. This helped us flesh out and solidify what we wanted the project to look like, and gave us a clear sense of our plans for the project once the hackathon ends.
Challenges we ran into:
We originally planned to use the MERN stack for our prototype, but because mLab was recently acquired by MongoDB, we could not create an account or proceed with that implementation. We began writing our current prototype soon afterwards and ran into an authentication problem midway through. The authentication portion of the login is the only thing holding our current prototype back, and we are very close to solving the issue. Along the way we resolved plenty of local dependency issues on our machines (for both projects). We also ran into issues with Bootstrap modals and incorrect network calls, which ultimately turned out to be the result of incorrect dependency versions as well.
Accomplishments that we are proud of:
We are proud of having gotten very close to a working core prototype. We are also proud that we educated ourselves about the resources currently available for mental health, and of our attempts to integrate those solutions into our project.
What we learned:
We learned how to brainstorm ideas, and that bringing an idea to fruition involves many challenges and aspects that need to be fully thought out, from the business model to actually coding a viable product; throughout, we kept troubleshooting to solve problems. For example, we initially wanted to merge two ideas with similar intent: one was our Vera app, and the other was to have licensed professionals available on the app/website to provide support. While the core intent of both was to help an organization provide its community with tools to fight mental illness, we realized the implementation styles were too different. We didn't want to compromise the quality of either idea, so we separated them. We also learned about current apps and products on the market, which forced us to analyze what was unique about our idea and how it could complement existing ones so that it would work the way it was intended. We also learned to look at our product from a business-model perspective, and came to appreciate some of the real-life challenges around healthcare and insurance policies. Finally, we learned a bit about how not to spend our coding time, and we are still working toward a better development workflow.
What's next for Vera:
First we plan to fix the authentication bug. We also plan to add more features on the affirmations side of the app: small branded touches such as a customizable mascot that sends positive messages and quotes at certain times of day or after a certain amount of app usage, and that counts and "celebrates" how many affirmations a user gives and receives. We plan to improve the app's capacity to give users as much data about themselves as possible, so they can track their feelings and mental health status. This could take the form of a mood questionnaire at the beginning and end of each session, with the data compiled into a chart accessible from the settings panel. This not only gives a person more insight into themselves; it can also quantify the positive effects of Vera's affirmations or of other life changes. It can show someone that adopting certain patterns in their life, whether an application or a new habit, can produce a tangible, real change in their health and life. It can also encourage people to seek professional help if they need it, after positive encouragement from their surrounding community. As for malicious usage, we have also thought about how to minimize the damage from Internet trolls. We plan to build out the flagging system so users can flag threatening, offensive, or concerning posts, and an admin can then review any post that receives more than a certain number of flags.
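The planned mood-tracking chart can be sketched as a simple transformation over session records. The data shape below is an assumption about the future feature, not existing code: each session stores a start and end mood score, and the chart compiles them into per-session points whose `change` field quantifies improvement.

```javascript
// Compile raw session records into chart-ready data points.
// A session is assumed to look like:
//   { date: "YYYY-MM-DD", startMood: number, endMood: number }
function compileMoodChart(sessions) {
  return sessions.map((s) => ({
    date: s.date,
    start: s.startMood,
    end: s.endMood,
    change: s.endMood - s.startMood, // positive => mood improved this session
  }));
}
```

Averaging the `change` field over time is one way the app could surface whether using Vera (or any new habit) correlates with measurable improvement.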