Headline
Effective communication using current video conferencing systems is severely hindered by the lack of eye contact caused by the disparity between the locations of the subject and the camera.
• We present a gaze correction approach based on the mobile camera that preserves the integrity and expressiveness of the face as well as the fidelity of the scene as a whole, producing nearly artefact-free imagery.
• Our solution is based on two approaches: 1) a software approach and 2) a hardware approach.
• Our method is suitable for mainstream video conferencing: it uses inexpensive consumer hardware, achieves real-time performance, and requires only a short, simple setup through a mobile application.
• For our application, it is sufficient to synthesize only the corrected face.
• Thus, we render a gaze-corrected 3D model of the scene and, with the aid of a face tracker, seamlessly transfer the gaze-corrected facial portion onto the original image (see the sketch after this list). The solution can be incorporated into existing systems without any modifications.
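The snippet below is a minimal, illustrative sketch of the face-transfer step only, not the exact pipeline described above. It assumes a gaze-corrected render already aligned with the original frame and a set of 2D facial landmarks from an upstream face tracker (both hypothetical inputs here), and uses OpenCV's Poisson (seamless) cloning to replace just the facial region.

```python
import cv2
import numpy as np

def transfer_corrected_face(original_frame, corrected_render, face_landmarks):
    """Blend the gaze-corrected facial region onto the original frame.

    original_frame   : BGR image from the front camera
    corrected_render : BGR render of the gaze-corrected 3D scene, assumed to be
                       pixel-aligned with original_frame
    face_landmarks   : Nx2 array of 2D facial landmark points produced by a
                       face tracker (hypothetical upstream component)
    """
    # Build a mask covering only the facial region (convex hull of the landmarks).
    mask = np.zeros(original_frame.shape[:2], dtype=np.uint8)
    hull = cv2.convexHull(face_landmarks.astype(np.int32))
    cv2.fillConvexPoly(mask, hull, 255)

    # Centre of the face region, required by seamlessClone.
    x, y, w, h = cv2.boundingRect(hull)
    center = (x + w // 2, y + h // 2)

    # Poisson blending hides the seam between the synthesized face and the
    # untouched background, so only the corrected face is replaced.
    return cv2.seamlessClone(corrected_render, original_frame, mask,
                             center, cv2.NORMAL_CLONE)
```

Because only the facial region is synthesized and blended, the rest of the frame remains untouched, which keeps the overall scene faithful to the original camera image.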
• To further improve the reliability and consistency of the system, the hardware solution can be incorporated into upcoming smartphones.
• To achieve maximum stability of the model, voice coil motors are incorporated into our camera module. These motors are commanded both by the gyro sensor and by patterns computed from the user's iris movement (illustrated in the sketch below).
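As a hedged illustration of how such a control loop could combine the two signals, the sketch below blends gyro angular velocity with the iris offset into a tilt command for the voice coil motors. The gyro, iris_tracker, and driver interfaces, as well as the gain values, are hypothetical placeholders, not real device APIs.

```python
import time

GYRO_GAIN = 0.8   # weight for gyroscope-based shake compensation (assumed value)
IRIS_GAIN = 0.2   # weight for iris-movement-based re-centering (assumed value)

def stabilization_loop(gyro, iris_tracker, driver, dt=0.005):
    """Drive the voice-coil tilt of the camera module from two signals."""
    while True:
        # Angular velocity (rad/s) about the x and y axes from the gyro sensor.
        wx, wy = gyro.read()

        # Offset (normalized image units) of the user's iris from the optical
        # centre, computed from the iris-tracking patterns.
        ix, iy = iris_tracker.offset()

        # Blend both signals into a single tilt command per axis.
        cmd_x = GYRO_GAIN * (-wx * dt) + IRIS_GAIN * ix
        cmd_y = GYRO_GAIN * (-wy * dt) + IRIS_GAIN * iy

        driver.tilt(cmd_x, cmd_y)
        time.sleep(dt)
```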