Inspiration

Many people fear the risks that come with driving, especially on the highway. Driving at high speeds can be stressful, and with crashes happening daily on highways, many drivers are understandably apprehensive. Our inspiration for creating FenderFinder stems from this fear: we want to reduce risk on the road and ease the anxieties many people feel behind the wheel.

What it does

FenderFinder is an application that uses a trained AI model to detect crashes on the road through a camera. Once the model flags a crash, FenderFinder uses GPS to find every other user with the app installed within a 5-mile radius and sends each of them a notification about the crash. The overarching goals are to improve traffic flow, to encourage nearby drivers to drive safely and potentially prevent follow-on crashes, and to provide valuable data for insurance companies and other legal purposes.
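The 5-mile radius check above can be sketched as a great-circle distance filter. This is a minimal illustration in Python, not the app's actual code: the user names, coordinates, and the `users_to_notify` helper are all hypothetical, and a real deployment would query user locations from a server rather than a dictionary.

```python
import math

EARTH_RADIUS_MILES = 3958.8  # mean Earth radius

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two GPS coordinates."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

def users_to_notify(crash_lat, crash_lon, users, radius_miles=5.0):
    """Return the IDs of app users within radius_miles of the crash site."""
    return [uid for uid, (lat, lon) in users.items()
            if haversine_miles(crash_lat, crash_lon, lat, lon) <= radius_miles]

# Hypothetical users around a crash in downtown Dallas.
users = {
    "alice": (32.7767, -96.7970),   # at the crash site
    "bob":   (32.7900, -96.8100),   # roughly a mile away
    "carol": (33.2100, -97.1300),   # far outside the 5-mile radius
}
print(users_to_notify(32.7767, -96.7970, users))
```

Only the users returned by the filter would receive the crash notification; everyone else is left undisturbed.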

How we built it

We trained an AI model with Google Teachable Machine to determine whether the camera sees a car crash or a clear path ahead. Teachable Machine exported the model as a TensorFlow Lite file, so we handled all of our back-end inference through TensorFlow Lite and built all of our front end in Swift.
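A two-class Teachable Machine model typically outputs one softmax score per class, and the app only needs to decide whether to alert drivers. The sketch below shows one way to post-process those scores; the label order, the `classify` helper, and the confidence threshold are all assumptions for illustration (on iOS this logic would run against the TensorFlow Lite interpreter's output tensor).

```python
# Assumed output order of the exported model: [clear, crash].
LABELS = ["clear", "crash"]

def classify(scores, crash_threshold=0.8):
    """Map the model's softmax scores to a label. Only report a crash
    when the score clears a confidence threshold, so drivers are not
    alerted on borderline predictions."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    label = LABELS[best]
    if label == "crash" and scores[best] < crash_threshold:
        return "clear"  # not confident enough to send notifications
    return label

print(classify([0.10, 0.90]))  # -> crash
print(classify([0.45, 0.55]))  # -> clear (crash score below threshold)
```

Requiring a high score before declaring a crash trades a little sensitivity for far fewer false alarms, which matters when every detection triggers notifications to every user within five miles.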

Challenges we ran into

After deciding to focus on the Toyota challenge, it took quite a while to figure out how to stand out from the competition and come up with a genuinely unique project. We then struggled with our front-end development: Swift was the best fit for FenderFinder, but our group was only thinly versed in it, so making sense of the front end while figuring out how to connect it to our back end took a lot of time and mental effort.

Accomplishments that we're proud of

Despite knowing only a minuscule amount of Swift, we were able to use it for a sizable portion of our front-end development. After learning enough Swift to build the front end, we connected it to our back end. That connection took a significant amount of time, and it was the part we were most worried about, but we managed to make it work and look presentable.

What we learned

By using Google Teachable Machine, we were able to export our trained AI model as a TensorFlow Lite file. We learned not only how to train a model with Teachable Machine but also how to export it to TensorFlow Lite and use TensorFlow Lite for our back-end development. We also learned a great deal about Swift in order to build a presentable, functional front end.

What's next for FenderFinder

We hope FenderFinder will be incorporated into future Toyota models. Currently, we demonstrate the model's ability to recognize car crashes by showing it photos of crashes and clear roads from a phone. We hope Toyota can build cameras into future vehicles that use this same technology to recognize crashes on the road, paired with an app that notifies everyone within a 5-mile radius when a crash occurs.
