Inspiration
We were inspired to create LauderLook after trying the Estée Lauder Virtual Makeup Assistant app. We realized that, with a few changes to improve its usability and accuracy, the app could be a helpful tool for visually impaired people to experiment with makeup and enhance their appearance. We were also motivated by our personal love of makeup and the desire to learn how to apply it step by step and to determine which colors suit our faces. This led us to envision LauderLook as a comprehensive platform where users can explore different makeup routines, experiment with products, and receive personalized guidance and recommendations based on their unique features and preferences.
What it does
LauderLook is an improved version of the Estée Lauder Virtual Makeup Assistant app, designed to be more accessible and user-friendly for visually impaired people. The app uses AI and computer vision to analyze a user's face and provide personalized makeup recommendations based on their skin tone, facial features, and preferences. It also offers step-by-step tutorials and makeup routines that users can follow to achieve specific looks.
How we built it
Since we participated in the Idea Track, we have not built LauderLook yet, but we have a development plan. We would use Android Studio and Xcode as our IDEs and build a cross-platform application with the Flutter framework and the Dart language. The app would integrate several AI components: computer vision, machine learning, and natural language processing. It would use the camera to capture images of the user's face, which computer vision algorithms would process to detect facial features and estimate skin tone. Machine learning models would analyze the user's data to generate personalized makeup recommendations, while natural language processing would power voice-guided instructions and step-by-step tutorials.
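To make the skin-tone step above concrete, here is a minimal sketch in Python (the production app would be written in Dart/Flutter with a real face-detection model). Everything in it is an assumption for illustration: the "face region" is passed in directly as pixel tuples rather than detected from a camera frame, the luminance thresholds are arbitrary, and the shade names are invented placeholders, not actual Estée Lauder products.

```python
def average_color(pixels):
    """Average an iterable of (R, G, B) tuples sampled from the face region."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def classify_depth(rgb):
    """Bucket skin depth by luminance (ITU-R BT.601 weights).
    Thresholds here are illustrative, not calibrated."""
    r, g, b = rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    if luma > 180:
        return "light"
    elif luma > 120:
        return "medium"
    return "deep"

def recommend_shade(pixels):
    """Map a face-region sample to a placeholder foundation shade."""
    depth = classify_depth(average_color(pixels))
    shades = {"light": "Shade 1N", "medium": "Shade 3W", "deep": "Shade 6C"}
    return depth, shades[depth]

# Example: a small patch of medium-toned pixels.
sample = [(170, 130, 110)] * 9
print(recommend_shade(sample))  # -> ('medium', 'Shade 3W')
```

In the real app, the recommendation step would be a trained model over many more features (undertone, facial geometry, user preferences) rather than a simple luminance lookup.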
Challenges we ran into
One of the main challenges we ran into was designing the app to be accurate and reliable for visually impaired users. We had to ensure the app's features were genuinely helpful while remaining easy to use.
Accomplishments that we're proud of
One of our biggest accomplishments was designing an experience that is more accessible and user-friendly for visually impaired people. We devised a system that uses voice-guided instructions and tutorials to help users apply makeup step by step, even if they cannot see the screen. We also planned personalized makeup recommendations based on skin tone and facial features, which can help users find the right products and colors for their unique needs.
What we learned
Designing LauderLook taught us a lot about the challenges of building technology that is accessible and user-friendly for everyone. We learned how important it is to consider the needs of different users and to design interfaces that are clear and easy to understand. We also learned about the power of AI and computer vision in providing personalized recommendations and assistance to users.
What's next for LauderLook
Next, we plan to develop and release the first version of LauderLook. From there, we can keep improving it and expanding its capabilities, integrating more advanced computer vision algorithms and machine learning models to provide even more personalized recommendations and tutorials. We can also explore integrating the app with other devices and platforms, such as smart mirrors or virtual reality headsets, to provide a more immersive and interactive experience for users.
Built With
- canva
- figma
