Inspiration
The inspiration for our hackathon idea stemmed from an experience one of our team members had during a recent hospital visit. They noticed the large number of staff required at every entrance to ensure that patients and visitors had their masks on properly, ask COVID-19 screening questions, and record their time of entry into the hospital. They thought about the potential problems this might cause, such as health care workers having a higher chance of getting sick due to more frequent exposure to other individuals, as well as the resources required to carry out this task.
We also discussed the scalability of this procedure and how it could apply to schools and businesses. Hiring an employee to perform these tasks may be financially unfeasible for small businesses and schools, but the social benefit these services provide would definitely help contain COVID-19.
Our team decided to see if we could use a combination of machine learning, AI, robotics, and web development to automate this process and create a solution that is financially feasible and reduces the workload on the hard-working individuals who keep us safe every day.
What it does
Our stand-alone solution consists of three main elements: the hardware, the mobile app, and the software that connects everything together.
Camera + Card Reader The hardware is meant to be placed at an entry point for a business/school. It automatically detects the presence of a person through an ultrasonic sensor. From there, it adjusts the camera to center the view for a better image and takes a screenshot. The screenshot is sent in an API request to the Microsoft Azure Custom Vision Prediction API, which returns a confidence value for each tag (Mask / No Mask). Once the person is confirmed to be wearing a mask, the individual is prompted to scan their RFID tag. The hardware looks up the owner of the RFID ID and records a check-in or check-out time on their profile in a cloud database (Firestore).
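The check-in/check-out decision after an RFID scan can be sketched as a small pure function. This is an illustrative reconstruction, not our exact code: the `record_scan` helper and the shape of the `profile` document (parallel lists of check-in and check-out timestamps, as we store them in Firestore) are assumptions for the example.

```python
from datetime import datetime

def record_scan(profile: dict, now: datetime) -> dict:
    """Decide whether an RFID scan is a check-in or a check-out.

    `profile` mirrors a hypothetical per-person Firestore document:
    parallel lists of check-in and check-out timestamps. A scan counts
    as a check-in when every previous check-in already has a matching
    check-out, and as a check-out otherwise.
    """
    if len(profile["check_ins"]) == len(profile["check_outs"]):
        profile["check_ins"].append(now)
        event = "check-in"
    else:
        profile["check_outs"].append(now)
        event = "check-out"
    return {"event": event, "time": now}
```

In the prototype, the equivalent logic runs in a Python script on the Raspberry Pi right after the RFID reader returns a tag ID, and the updated lists are written back to the person's Firestore document.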
Mobile Application The mobile application is intended for the administrator/business owner who would like to manage the hardware settings and observe analytics. _(We did not have enough time to complete that, unfortunately.)_ Additionally, the mobile app can perform basic contact tracing through an API request to a custom-made Autocode API that checks the database and determines recent potential instances of exposure between employees based on check-in and check-out times. It also identifies the affected employees and automatically sends them an email with the dates of the potential exposure instances.
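The core of the contact-tracing calculation is an interval-overlap check on check-in/check-out times. The sketch below is a simplified, hypothetical version of what the Autocode endpoint computes (the `find_exposures` name and the `intervals` data shape are assumptions for illustration):

```python
def find_exposures(intervals, case_id):
    """Flag employees whose on-site time overlapped with a positive case.

    `intervals` maps an employee ID to a list of (check_in, check_out)
    timestamp pairs; `case_id` identifies the employee who reported a
    positive test. Returns a mapping of employee ID -> list of dates on
    which a potential exposure occurred.
    """
    exposures = {}
    for case_in, case_out in intervals[case_id]:
        for emp, visits in intervals.items():
            if emp == case_id:
                continue
            for emp_in, emp_out in visits:
                # Two intervals overlap iff each starts before the other ends.
                if emp_in < case_out and case_in < emp_out:
                    exposures.setdefault(emp, []).append(case_in.date())
    return exposures
```

The real endpoint additionally emails each affected employee the dates it finds; that side effect is omitted here.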
The software Throughout our application, we had many smaller pieces of software that were used to run our overall prototype. From the Python scripts on our Raspberry Pi that communicate with the database, to the custom API made on Autocode, there were many small pieces that we had to put together for this prototype to work.
How we built it
For all of our team members, this was our first hackathon, and we had to think creatively about how to turn our idea into a reality. Because of this, we used many well-documented, beginner-friendly services to create a "stack" that we could manage with our limited expertise. Our team's background is mainly in robotics and hardware, so we definitely wanted to incorporate a hardware element into our project; however, we also wanted to take full advantage of this amazing opportunity at Hack The 6ix and apply the knowledge we learned in the workshops.
The Hardware In order to make our hardware, we utilized a Raspberry Pi and various sensors that we had on hand. Our hardware consisted of an RFID reader, Ultrasonic Sensor, Servo Motor, and Web Camera to perform the tasks mentioned in the section above. Additionally, we had access to a 3D printer and were able to print some basic parts to mount our electronics and create our device. (Although our team has a stronger mechanical background, we spent most of our time programming haha)
Mobile Application To program our mobile app, we utilized Flutter, a framework developed by Google that makes it very easy to rapidly prototype a mobile application supporting both Android and iOS. Because Flutter is based on the Dart language, it was very easy to follow along with tutorials and documentation, and some members had previous experience with Flutter. We also decided to go with Firestore as our database, as there was quite a lot of documentation and support for using the two together.
Software To put everything together, we had to utilize a variety of skills and get creative with how to connect our backend, considering our limited experience in programming and computer science. To run the mask detector, we first used some Python scripts on a Raspberry Pi to center the camera on the subject and perform very basic face detection to decide whether to take a screenshot to send to the cloud for processing. We did not want to stream our entire camera feed to the cloud, as that could be costly due to a high rate of API requests and impractical due to hardware limitations. Instead, we used lightweight on-device face detection to determine whether a screenshot should be taken, and from there we send it in an API request to the Microsoft Azure Custom Vision Prediction API, where we had trained a model on two tags (Mask and No Mask). We were very impressed with how easy it was to set up the Prediction API, and it really helped our team achieve reliable, accurate, and fast mask detection.
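The cloud side of this pipeline can be sketched as two pieces: an HTTP call to the Custom Vision Prediction API and a threshold check on its response. The endpoint URL and prediction key below are placeholders, and `is_wearing_mask` with its 0.8 threshold is an illustrative assumption rather than our exact code; the JSON shape matches what Custom Vision returns.

```python
import json
import urllib.request

# Placeholder endpoint; a real one includes your region, project ID, and
# published iteration name.
PREDICTION_URL = ("https://<region>.api.cognitive.microsoft.com/customvision/"
                  "v3.0/Prediction/<project-id>/classify/iterations/"
                  "<iteration>/image")

def classify_frame(image_bytes: bytes, prediction_key: str):
    """POST one screenshot to the Custom Vision Prediction API."""
    req = urllib.request.Request(
        PREDICTION_URL,
        data=image_bytes,
        headers={"Prediction-Key": prediction_key,
                 "Content-Type": "application/octet-stream"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["predictions"]

def is_wearing_mask(predictions, threshold=0.8):
    """Accept only when the 'Mask' tag is confident and beats 'No Mask'."""
    scores = {p["tagName"]: p["probability"] for p in predictions}
    mask = scores.get("Mask", 0.0)
    return mask >= threshold and mask > scores.get("No Mask", 0.0)
```

On the Pi, `classify_frame` would only be called once the local face detector has confirmed a face in frame, keeping the API request rate low.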
Since we did not have much experience with back-end development in Flutter, we decided to utilize Autocode, a very powerful tool that we learned about during a workshop on Saturday. With its ease of use and utility, we created a back-end API that our mobile app could call with a simple HTTP request; through that, our Autocode program could interact with our Firestore database to perform basic calculations and achieve the basic contact tracing we wanted in our project. The Autocode project can be found here!
Challenges we ran into
Most of the challenges we ran into were due to our limited experience in back-end development, which left us with a lot of gaps in the functionality of our project. However, the mentors were very friendly and helped us connect the different parts of our project. Our creativity also helped us tie those portions together.
Another challenge we ran into was our hardware. Because of quarantine, many of us were at home and did not have access to lab equipment that could have been very helpful in diagnosing most of our hardware problems (multimeters, oscilloscopes, soldering irons). However, we were able to solve these problems, albeit using very precious hackathon time to do so.
What we learned
- Hackathons are very fun; we definitely want to do more!
- Sleep is very important. :)
- Microsoft Azure services are super easy to use
- Autocode is very useful and cool
What's next for Unmasked
The next steps for Unmasked would be to expand the app's contact tracing feature, as knowing who was in the same building at the same time does not provide enough information to determine who may actually be at risk. One potential solution would be to have employees scan their IDs at specific locations as well, making it possible to determine whether any individuals were actually near those with the virus.
Built With
- 3d-printing
- autocode
- azure
- computer-vision
- custom-api
- firebase
- firestore
- flutter
- iot
- javascript
- machine-learning
- opencv
- python
- raspberry-pi
- solidworks
- ui