Princeton University researchers have developed an editing tool that can correct distortions in self-portrait photographs by manipulating a digital image to make a subject's face appear as if it were photographed from a longer distance or a different angle.
The researchers developed a model that generates digital three-dimensional (3D) heads, combined with a program that identifies more than 70 facial reference points. The 3D head is adjusted to match the reference points detected in a two-dimensional image, so a selfie's facial reference points can then be shifted to approximate a change in the head's 3D orientation or the camera's distance.
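The perspective distortion the tool corrects can be illustrated with a toy pinhole-camera model. The sketch below is purely illustrative, not the researchers' method: the handful of hand-picked 3D landmark coordinates stand in for the 70-plus reference points a real fit would use, and all distances are made up. It shows why a nose photographed at arm's length looks wider relative to the eyes than the same nose photographed from farther away.

```python
import numpy as np

# Hypothetical 3D landmark coordinates in cm; z is how far each feature
# protrudes toward the camera (the nose sticks out, eyes and chin lie on
# the face plane). These numbers are illustrative, not from the paper.
landmarks_3d = np.array([
    [-1.5, -1.0, 3.0],   # nose, left edge
    [ 1.5, -1.0, 3.0],   # nose, right edge
    [-4.0,  2.0, 0.0],   # left eye
    [ 4.0,  2.0, 0.0],   # right eye
    [ 0.0, -5.0, 0.0],   # chin
])

def project(points, camera_dist, focal):
    """Pinhole projection: camera at the origin looking down +z,
    with the face plane at z = camera_dist."""
    z = camera_dist - points[:, 2]            # protruding features are closer
    return focal * points[:, :2] / z[:, None]

# Arm's-length selfie (~30 cm) vs. portrait distance (~150 cm), with the
# focal length scaled 5x so features on the face plane project identically.
near = project(landmarks_3d, 30.0, 1.0)
far = project(landmarks_3d, 150.0, 5.0)

nose_width_near = near[1, 0] - near[0, 0]
nose_width_far = far[1, 0] - far[0, 0]
print(nose_width_near > nose_width_far)   # True: the close camera widens the nose
```

Mapping each pixel from its near-camera projection to its far-camera projection is, in essence, the kind of image warp such a tool applies once the 3D head has been fit to the detected landmarks.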
The synthetic image looks realistic because it is built from the exact pixel colors of the original photograph.
Before pursuing commercial development, the researchers will focus on perfecting the tool's handling of hair, which often looks contorted in the synthetic images because of its varied texture and color.
Moreover, body features that are not visible in the original picture would appear to be missing or distorted in the altered pose.
"As humans, we have evolved to be very sensitive to subtle cues in other people's faces, so any artifacts or glitches in synthesized imagery tend to really jump out," notes Princeton professor Adam Finkelstein.
The work was presented this week at the ACM SIGGRAPH 2016 conference in Anaheim, CA.
From Princeton University
Abstracts Copyright © 2016 Information Inc., Bethesda, Maryland, USA