Inspiration

This app was inspired by the personal story of one of our team members, whose family friend suffered third-degree burns over much of her body and was unable to call 911 on her phone. Her parents were only able to call 911 later, by which point an ambulance could not reach her and get her to the hospital in time. She had no choice but to be airlifted, which is extremely costly, and there was a high chance she would not have made it to the hospital in time. Additionally, her parents had only provided minimal care, as they did not know how to fully treat burns. This story took a strong hold on our hearts, and we hoped to create solutions that help prevent future incidents like this one. Specifically, we want to make the most of the waiting time and make the process of calling 911 immediate.

What it does

The app is intended to provide immediate care for the patient. Its target audience is rural areas, where ambulance response times are nearly double those of urban areas on average. The first screen offers the options of calling 911 and sending medical information to emergency services, viewing medical videos, or signing up/logging in. The app is designed so that patients cannot misjudge the severity of their injuries and do not need prior medical knowledge or to search for that information themselves. OpenCV is used to analyze photos of injuries and provide basic information and possible actions the patient can take while waiting for the ambulance. We emphasized privacy, user-friendliness, and speed: all user information is password protected, personal information is sent only to medical care providers, and every action in the app is automated and immediate. For example, users only have to click one button to call an ambulance and send their medical information via the QuickCare app.

How we built it

We built this app using Android Studio, Firebase, OpenCV, Java, and Python. Android Studio was used to create the main app: the home page, emergency button, video playlist, chat messaging system, and the general UI/UX. Firebase handles authentication and stores the chat messages between injured users and Emergency Medical Technicians (EMTs). OpenCV handles wound identification: machine-learning-based image analysis scans the photos to find the size of the wound and recommend the optimal bandage size. The OpenCV component is written in Python, while the Android Studio app is mainly in Java.

Challenges we ran into

Because the Android Studio app was in Java and our OpenCV code was in Python, we could not simply import OpenCV and write Python directly. We had to figure out how to bundle a .py file in an Android Studio project and run it. Finding a way to do this was difficult, since attaching a Python file to an Android Studio project is uncommon, so we had to get creative. We eventually made it work, but it took a long time, as the Python code triggered multiple errors in Android Studio. In our app, users take pictures with their phone camera and select wound pictures to send to medical professionals through the in-app messaging system. We could select the wound pictures, but initially had difficulty sending them in the app. We realized the problem was on the Firebase side and were able to resolve it.
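We don't detail our exact bridging setup here, but one common way to run Python inside an Android app is the Chaquopy Gradle plugin, which lets Java call functions in a plain Python module. A minimal sketch of such an entry point (the module and function names are hypothetical) returns JSON, so the Java side only passes primitives in and parses a string out:

```python
# wound_bridge.py -- hypothetical module a Java<->Python bridge (e.g. Chaquopy)
# could call. Returning a JSON string keeps the interface simple: no custom
# object marshalling between the two languages.
import json

def analyze(width_px, height_px, px_per_cm=40.0):
    """Estimate wound area from a bounding box and suggest a bandage size."""
    area_cm2 = (width_px * height_px) / (px_per_cm ** 2)
    if area_cm2 < 4:
        size = "small"
    elif area_cm2 < 25:
        size = "medium"
    else:
        size = "large"
    return json.dumps({"area_cm2": round(area_cm2, 2), "bandage": size})
```

On the Java side, the returned string can be parsed with any JSON library, which avoids most of the type-conversion errors that make mixed-language Android projects painful.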

Accomplishments that we're proud of

We are proud of learning how to use a variety of languages, integrating those languages with each other, and building an engaging UI/UX. When we ran into errors, we were determined to find solutions, and we achieved that goal: for example, we had difficulties moving the selected images into the chat messaging system, but we still found a way. Team members who were newer to coding learned basic OpenCV and were able to contribute to the image detection and analysis. We are also very proud of creating a UI that incorporates a multitude of medical videos that injured patients can use to treat their wounds before EMS arrives. Finally, we are proud of our teamwork and of how we planned this application. We thought through the app's flow thoroughly and made a detailed plan before starting to work, and we checked in and coordinated with each other often to make sure everybody was on the same page.

What we learned

Throughout this hackathon, we learned many new things, such as how to analyze a picture. This was the first time our team used OpenCV in a project, and we learned how to integrate it into Android Studio even though the two were in different languages. We also learned how to use Firebase and how to plan the flow a user follows through the app. This project also improved our research skills and our ability to quickly determine whether information is useful.

What's next for QuickCare

In the future, we plan to implement a way to take the patient’s vitals and use that data to determine whether 911 needs to be called automatically. Analyzing historical ambulance ETA data could also help determine how much information should be shown to the user: if users know too much about the risks, they are more liable to panic, which can cause further harm. We plan to add ways to detect non-visible emergencies such as strokes and heart attacks, as well as more information about these situations in our database of videos. We would also like to fully integrate the chosen pictures into the OpenCV pipeline and possibly use Firebase to save and cast these images.
