Inspiration
Each one of us is captivated by the rainbow and its seven colors, but have you ever wondered what it would be like to see only 2-3 of those colors, or perhaps just greyish-white? That is how roughly 300 million people in the world with color vision deficiency, approximately 1 in 200 women and 1 in 12 men, see the world.
Now imagine being unsure about your looks because of color deficiency: how to put your makeup on, and which lip shade best matches your skin tone or your outfit. We bring you EL Wave to narrow down the problem of choosing makeup according to skin tone.
Why wave? Because sound and light share the fundamental nature of vibration, and we were inspired by the approaches researchers have taken over the centuries to identify the relationship between sound and color. We therefore came up with an out-of-the-box correlation between color and sound waves that can help color-blind people feel colors like never before. Through EL Wave, anyone can feel the colors: the sound of their skin tone and the sound of their best-suited lip shade, just by listening to the correlated frequency.
What it does
EL Wave is an interactive web application that recommends personalized lip shades to color-blind users and lets them hear the colors.
Here is the experience: once the user lands on our website, they answer a few questions about the kind of look they prefer. Then a real-time picture of the user is taken to detect their skin tone, which falls into one of six categories of skin color. Once the skin tone is detected, personalized shades are recommended. Along with the personalization, we wanted the user to actually feel the color they are going to shop for.
Users can try out the wave feature to listen to the sound of the colors and of their recommended lip shade.
How we built it
Development
Most of the application's business logic is written on the client side using React, a lightweight JavaScript library. In addition, the application uses a Python-based Azure Function, which we wrote to identify a person's skin tone using libraries such as OpenCV and scikit-learn. The image, captured in the client app and encoded as base64, is passed to the Azure Function, which extracts the pixels falling within the skin color range and then analyzes their color information. We then use the K-means clustering algorithm to cluster those pixels by their RGB values, and from the resulting dominant RGB value we identify which of the six major categories the skin tone falls under.
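A minimal sketch of what this pipeline might look like is below; the YCrCb skin range and the number of clusters are illustrative assumptions, not our exact production values:

```python
import base64

import cv2
import numpy as np
from sklearn.cluster import KMeans

def dominant_skin_rgb(image_b64: str) -> np.ndarray:
    """Decode a base64 image, mask skin-colored pixels, and return the
    dominant skin color as an RGB triple via K-means clustering."""
    # Decode base64 -> BGR image (OpenCV's native channel order)
    img_bytes = base64.b64decode(image_b64)
    img = cv2.imdecode(np.frombuffer(img_bytes, np.uint8), cv2.IMREAD_COLOR)

    # Keep only pixels inside an (illustrative) skin range in YCrCb space
    ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
    skin_pixels = img[mask > 0].astype(float)  # N x 3 array of BGR values

    # Cluster the skin pixels by color and take the largest cluster's center
    kmeans = KMeans(n_clusters=3, n_init=10).fit(skin_pixels)
    counts = np.bincount(kmeans.labels_)
    dominant_bgr = kmeans.cluster_centers_[counts.argmax()]
    return dominant_bgr[::-1]  # reverse BGR -> RGB
```

The returned RGB value can then be matched, by nearest distance, against the six reference skin tone categories.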
Lipstick Finder - We have created an easy-to-use form that asks the user simple questions about their product preferences and, using the backend API to identify their skin tone, recommends the best-suited lipstick shades. As part of this hackathon, we fetched a few products from Estee Lauder's shopping website, selected by their uses and colors, to build our dataset. From this dataset, we recommend products to the users.
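The recommendation step itself can be as simple as filtering that dataset on the detected skin tone category and the user's stated preferences. A rough sketch, with hypothetical field names and placeholder records:

```python
# Hypothetical records; the real dataset was scraped from Estee Lauder's
# shopping website during the hackathon.
PRODUCTS = [
    {"name": "Shade A", "tone_category": 3, "finish": "matte"},
    {"name": "Shade B", "tone_category": 3, "finish": "gloss"},
    {"name": "Shade C", "tone_category": 5, "finish": "matte"},
]

def recommend(tone_category: int, finish: str, limit: int = 5) -> list[dict]:
    """Return up to `limit` lipsticks matching the user's detected skin
    tone category and preferred finish."""
    matches = [p for p in PRODUCTS
               if p["tone_category"] == tone_category and p["finish"] == finish]
    return matches[:limit]

print(recommend(tone_category=3, finish="matte"))  # -> [{'name': 'Shade A', ...}]
```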
Play Colors - An out-of-the-box correlation between color and sound waves that can help color-blind people feel colors like never before. Sound is based on vibrations of air molecules traveling as a compression wave, while light (and hence color) is an electromagnetic wave. Although "frequency" is a measure commonly used for both compression and electromagnetic waves, the two types of waves are quite different.
Colors of light are pure frequencies that our eyes perceive as a single color. The RGB (red, green, blue) color system used by HTML, and displayed on most color monitors, blends three pure light sources (a red gun, a green gun, and a blue gun, in the case of older CRT displays) to create the impression of a single color. In the RGB system, our eyes perceive some colors that do not exist as pure colors of the spectrum, such as pink and white; these are blends of multiple colors from the pure spectrum. The RGB color model is called an "additive" color system because it adds colors together to render a perceived color.

To approximate these colors in the RGB color model, we identify the dominant color band of an RGB color by computing its hue angle after converting it to the HSL format. The wavelength of that dominant color is then found using the algorithm from Dan Bruton's Color Science page. From that wavelength, we compute the frequency of the light wave using the correlation below.
Frequency of light = speed of light / wavelength
The frequency of the light wave is then mapped to a corresponding frequency of the sound wave using the formula below.
Frequency of sound = 440 * 2**((frequency of light in THz - 650) / 140)
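Putting the pieces together, a minimal Python sketch of the color-to-sound conversion might look like the following. The hue-to-wavelength step is a simple linear inversion we assume for Bruton's wavelength-to-RGB mapping (red at hue 0° near 650 nm down to violet at hue 270° near 400 nm); the endpoints are illustrative, and non-spectral hues beyond 270° (magentas) are clamped:

```python
import colorsys

SPEED_OF_LIGHT = 299_792_458  # m/s

def rgb_to_sound_hz(r: int, g: int, b: int) -> float:
    """Map an RGB color to an audible frequency via its dominant wavelength."""
    # 1. Hue angle in degrees from the HSL representation
    h, _lightness, _saturation = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    hue_deg = min(h * 360, 270)  # clamp non-spectral magenta hues

    # 2. Hue -> wavelength in nm (assumed linear inversion of Bruton's mapping)
    wavelength_nm = 650 - (250 / 270) * hue_deg

    # 3. Wavelength -> light frequency in THz: f = c / wavelength
    freq_light_thz = SPEED_OF_LIGHT / (wavelength_nm * 1e-9) / 1e12

    # 4. Light frequency -> sound frequency, scaled around A4 = 440 Hz
    return 440 * 2 ** ((freq_light_thz - 650) / 140)

print(round(rgb_to_sound_hz(255, 0, 0)))  # pure red  -> ~173 Hz
print(round(rgb_to_sound_hz(0, 0, 255)))  # pure blue -> ~566 Hz
```

With this mapping, the whole visible spectrum lands comfortably inside the audible range, centered near concert pitch.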
Deployment
Deployment of the components is done on the Azure cloud. The client-side application is deployed as a static website on a storage account, and the backend API is deployed as a serverless, HTTP-triggered Azure Function that communicates with the client application through a REST POST endpoint.
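The serverless entry point is only a thin wrapper around the skin tone logic. A sketch using the Azure Functions Python programming model, reusing the dominant_skin_rgb routine sketched earlier (the JSON field names are assumptions):

```python
import json

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered Azure Function: accepts a JSON body containing a
    base64-encoded image and responds with the dominant skin color."""
    body = req.get_json()
    image_b64 = body.get("image")  # field name is hypothetical

    # dominant_skin_rgb is the OpenCV + K-means routine sketched earlier
    rgb = dominant_skin_rgb(image_b64)

    return func.HttpResponse(
        json.dumps({"dominant_rgb": [int(c) for c in rgb]}),
        mimetype="application/json",
    )
```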

Challenges we ran into
The major challenge in this hackathon was understanding the problems from the point of view of people with disabilities, and the major challenges they face in the cosmetics industry. After a lot of research within a limited timeframe, and with amazing team collaboration, we were able to submit our idea to this hackathon.
Accomplishments that we're proud of
When we heard about this hackathon on Devpost, we got really excited to develop accessible and inclusive solutions for people with disabilities. We are glad we were able to pull off a product that not only empowers color-blind people in making decisions but also enhances their shopping experience.
What we learned
This is the first time we have worked on a tech product for the cosmetics industry. As our dependency on technology grows, it becomes important to create solutions that empower everyone to partake in what the world has to offer. We learned how technology can enhance the shopping experience for people who find it difficult to compare the colors of cosmetic products.
What's next for EL Wave
Given the limited time available for the submission, we could focus on only one cosmetic product, i.e., lipsticks. In the future, we would like to extend the project's scope to other products that people find difficult to choose because of color deficiency. It would also be a great opportunity to standardize the conversion factor from color to sound frequency and make it more accurate. We know this will be a daunting task, especially finding an accurate correlation for the RGB color bands, but with further research we may achieve something that helps people with disabilities feel colors and empowers them to see the world in a whole different way.
