ColourSense

Inspiration

Since beginning our journey at Hack The Hill, our team was looking to apply computer vision and machine learning with open-source libraries in our submission. After being introduced to the MakerCon categories, we knew we wanted to build a more convenient solution for colour-blind and elderly people. We drew inspiration from colour-blind corrective glasses.

What it does

ColourSense uses AI to accurately detect the colour of any item seen by the camera. It also draws on a range of colourfully descriptive words to give users a better understanding of each colour.
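The source doesn't show how a detected colour is mapped to a descriptive word, but one common approach is nearest-neighbour matching against a named palette. The palette entries and function name below are hypothetical, illustrative stand-ins for the app's real (larger) vocabulary:

```python
import math

# Hypothetical palette: ColourSense's actual vocabulary of descriptive
# colour words is not shown in the source; these entries are examples.
PALETTE = {
    "crimson red": (220, 20, 60),
    "forest green": (34, 139, 34),
    "sky blue": (135, 206, 235),
    "golden yellow": (255, 215, 0),
    "charcoal grey": (54, 69, 79),
    "ivory white": (255, 255, 240),
}

def describe_colour(rgb):
    """Return the descriptive name of the palette colour nearest to `rgb`
    by Euclidean distance in RGB space."""
    return min(PALETTE, key=lambda name: math.dist(PALETTE[name], rgb))
```

For example, a camera reading of `(230, 30, 70)` would land closest to "crimson red". Euclidean distance in RGB is a simple baseline; a perceptual colour space such as CIELAB would match human judgement more closely.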

How we built it

We used React Native because we wanted a cross-platform application available to both iPhone and Android users. On the backend, we used Flask and OpenCV to implement the central features, allowing for a performant but simple solution.
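The backend internals aren't shown in the source, so as a rough sketch of the kind of per-frame logic a Flask/OpenCV service might run, here is a pure-Python average of a patch at the frame centre. In the real app, OpenCV (e.g. `cv2.imdecode`) would decode the uploaded camera image into pixels first; the function name and flat pixel layout here are assumptions for illustration:

```python
def dominant_colour(pixels, width, height, patch=10):
    """Average RGB over a window of up to (2*patch) x (2*patch) pixels
    centred in the frame.

    `pixels` is a flat, row-major list of (r, g, b) tuples, the shape a
    decoded camera frame might take once OpenCV has decoded it.
    """
    cx, cy = width // 2, height // 2
    total = [0, 0, 0]
    count = 0
    for y in range(max(0, cy - patch), min(height, cy + patch)):
        for x in range(max(0, cx - patch), min(width, cx + patch)):
            r, g, b = pixels[y * width + x]
            total[0] += r
            total[1] += g
            total[2] += b
            count += 1
    # Integer-average each channel over the sampled window.
    return tuple(t // count for t in total)
```

Averaging a centre patch rather than a single pixel makes the reading more robust to sensor noise, which matters when every frame from a phone camera is slightly different.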

Challenges we ran into

The biggest challenge was learning React Native for the first time, as no one on the team had used the framework before. It was also surprisingly hard to detect colours on screen and produce an accurate description for every unique colour.

Accomplishments that we're proud of

  • Learning to use React Native within 36 hours
  • Pushing ourselves to ship the app in time
  • One of our first experiences with mobile development
  • First time working with computer vision

Built With

JavaScript, Python, React Native
