Inspiration
Sketches and 3D models serve as a visual language, providing a powerful means of communication and idea expression. Through these mediums, we interact with the world and convey complex concepts. Sketching is a fundamental tool for enhancing creativity, while 3D models offer a comprehensive understanding of ideas through detailed representation. However, creating realistic drawings and modeling intricate 3D objects can be challenging, particularly for children.
This project introduces DrawN, an interactive interface designed to help users develop and improve their drawing skills while enabling the retrieval of corresponding 3D objects through sketch matching. With DrawN, even a simple sketch or outline allows children to intuitively design and visualize spaces in 3D, leveraging both their creativity and the capabilities of machine intelligence.
What it does
The system consists of input, output, and computational processes designed to interpret user sketches and retrieve corresponding 3D objects with compelling interactivity. DrawN employs a method where users' cognitive or intuitive query sketches are compared with a preprocessed, line-rendered view of 3D objects stored in a database, utilizing HOG feature descriptors for accurate matching. With further enhancements, DrawN has the potential to serve as an assistive tool in education, streamline the design process, enable quick 3D prototyping, and function as an information retrieval system that uses abstract sketching or drawing as an interactive medium.
How I built it
DrawN comprises computational processes and a user-friendly interface. The key computational steps are as follows:
Database Construction: The process begins by selecting the most suitable viewpoints and capturing 2D images of objects. Canny edge detection is applied to these images to generate line renderings or line drawings. Each image is then represented in the database using HOG (Histogram of Oriented Gradients) descriptors.
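To make the descriptor step concrete, here is a minimal, dependency-free sketch of a grid-of-cells orientation histogram, the core idea behind HOG. This is an illustrative simplification (no block normalization or gradient interpolation), not DrawN's actual implementation, which may use a full HOG from a computer vision library:

```python
import numpy as np

def hog_descriptor(img, cell=8, bins=9):
    """Simplified HOG: per-cell histograms of gradient orientations,
    weighted by gradient magnitude, concatenated and L2-normalized."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)                      # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation in [0, 180)
    h, w = img.shape
    desc = []
    for i in range(h // cell):
        for j in range(w // cell):
            m = mag[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            a = ang[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            # bin each pixel's orientation into one of `bins` buckets
            idx = np.minimum((a / (180.0 / bins)).astype(int), bins - 1)
            hist = np.bincount(idx.ravel(), weights=m.ravel(), minlength=bins)
            desc.append(hist)
    v = np.concatenate(desc)
    n = np.linalg.norm(v)
    return v / n if n > 0 else v
```

A 32x32 line rendering with 8x8 cells and 9 bins yields a 4x4x9 = 144-dimensional descriptor; the same function is applied to both database renderings and user sketches so they live in one comparable feature space.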
User Sketch Representation and Matching: An abstract user sketch is converted into a HOG feature descriptor. This descriptor is then compared with each line-rendered feature descriptor in the database to calculate a similarity index using cosine distance.
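The similarity computation itself is a standard cosine measure between two descriptor vectors (cosine distance is simply 1 minus this value). A minimal version:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors, in [-1, 1]."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    if na == 0.0 or nb == 0.0:
        return 0.0  # treat an all-zero descriptor (blank sketch) as dissimilar
    return float(np.dot(a, b) / (na * nb))
```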
Retrieval of 3D Models: The top 10 matches, based on the highest similarity index, are selected. The system then retrieves the corresponding 3D model from the database by allowing the user to choose from the best-matched line renderings.
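The top-10 step amounts to scoring the query against every stored descriptor and sorting. A vectorized sketch of this ranking (the function name and shapes are illustrative, not taken from the DrawN codebase):

```python
import numpy as np

def top_matches(query, database, k=10):
    """Rank database rows by cosine similarity to the query descriptor.
    Returns (row_index, similarity) pairs, best match first."""
    q = np.asarray(query, dtype=float)
    D = np.asarray(database, dtype=float)   # one descriptor per row
    qn = np.linalg.norm(q)
    q = q / (qn if qn > 0 else 1.0)
    norms = np.linalg.norm(D, axis=1, keepdims=True)
    norms[norms == 0] = 1.0                 # guard against zero rows
    sims = (D / norms) @ q                  # cosine similarity per row
    order = np.argsort(sims)[::-1][:k]      # indices of the k best scores
    return [(int(i), float(sims[i])) for i in order]
```

The returned indices identify the best-matched line renderings; the user's choice among them then maps back to the corresponding 3D model.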
The DrawN interface is designed for ease of use, supporting both standard mouse and tablet stylus/pen inputs:
Sketching and Selection: Users start by drawing freeform sketches on a blank canvas. Once the sketch is complete, they can search for the best matches to their initial sketch and select the desired line drawing or 3D model from the database. Alternatively, users can trace one of the best-matched line drawings to practice and improve their sketching skills. After tracing, they can retrieve the corresponding 3D model. The interactive nature of DrawN allows users to refine their query by selecting relevant images from the initial retrieval list, iterating this process until they achieve the desired result.
Interactive Learning and Gaming: DrawN also offers a gaming experience to enhance user engagement and sketching skills. After completing a freeform abstract sketch and tracing, users can redraw the learned sketch on a blank canvas and receive a similarity score out of 100, encouraging continuous improvement.
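One simple way to turn a descriptor similarity into the game's score out of 100 is to clamp and rescale it; this mapping is purely illustrative, since the source does not specify DrawN's actual scoring formula:

```python
def sketch_score(similarity):
    """Map a cosine similarity in [0, 1] to an integer score out of 100.
    (Hypothetical mapping; DrawN's exact formula is not specified.)"""
    return int(round(max(0.0, min(1.0, similarity)) * 100))
```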
This system not only facilitates the development of drawing skills but also provides an innovative approach to retrieving and interacting with 3D models, making it a valuable tool for education, design, and creative exploration.
Important Steps in the Backend
The sketch-based 3D model retrieval system consists of three primary components, each shaped by the feature descriptors used:
Feature Extraction: This involves converting the abstract user sketch into a geometric feature descriptor that accurately represents the sketch's key characteristics.
Image Database and Feature Storage: A comprehensive database is created, containing 3D models, line drawings derived from these models, and encoded images, all represented using feature descriptors. These are then systematically stored for efficient retrieval.
Similarity Measurement: This component focuses on assessing the differences between the query sketch and the images stored in the database, enabling accurate matching and retrieval.
Accomplishments that I'm proud of
- Developed a Python application utilizing computer vision and graphics libraries.
- Created a versatile interface that can be deployed on any system with minimal dependencies.
- Conducted user interviews during the design phase and validated the prototype through a comprehensive user study.
- Presented the project at India HCI, an international conference on human-computer interaction.
What I learned
- Techniques and methods in computer vision and computer graphics
- Application programming in Python
- System modeling and the human-computer interaction design process
What's next for DrawN
- Implementing sketch classification and enhancement through machine learning using deep neural networks.
- Enhancing the graphical user interface (GUI) and refining the overall user experience and use cases.
Summary
Children's cognitive abilities, particularly in sketching, can vary significantly across age groups; for this experiment, the target age group is 7-16 years old. The goal is to develop an application that helps users draw more naturally, offering guidance throughout the process and enabling the creation of 3D models from simple 2D sketches. Its future potential includes storytelling, education, and the enhancement of children's ideation and visualization skills. With further development, DrawN could serve as an assistive tool in education, support the design process, facilitate quick 3D prototyping and explanations, and function as an information retrieval system that uses abstract sketching as an interactive medium.
