Why FaceApp's Selfie Filters Work So Well and Why They Don't

Over the past few weeks, Facebook and Instagram feeds have been flooded with creepily realistic facial transformations produced by “FaceApp,” a free app for iOS and Android whose filters add smiles to photos, alter faces to make them older or younger, and even make them “male” or “female.” The app’s startling believability is made possible by neural networks, a kind of artificial intelligence in which a computer learns by analyzing thousands, sometimes millions, of example images with known traits.
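FaceApp hasn’t published how its networks are built, but “learning from examples with known traits” broadly means supervised learning: show the system labeled data, and let it build an internal summary it can apply to new faces. Here’s a deliberately tiny sketch of that idea in plain Python—the two “features” (brow darkness, jaw width) and all the numbers are invented for illustration, and a real system like FaceApp works on raw pixels with deep networks, not hand-picked numbers:

```python
# Minimal sketch of "learning from labeled examples" (all data invented).
# Each "face" is boiled down to two made-up numbers: brow darkness and
# jaw width. The "training" step just averages examples per label.

def train_centroids(examples):
    """Average the feature vectors for each label -- the 'learning' step."""
    sums, counts = {}, {}
    for features, label in examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in s] for label, s in sums.items()}

def predict(centroids, features):
    """Classify a new face by its closest learned average."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], features))

# Invented training data: (brow_darkness, jaw_width) -> perceived label.
training = [
    ([0.9, 0.8], "male"), ([0.8, 0.9], "male"),
    ([0.3, 0.4], "female"), ([0.2, 0.3], "female"),
]
centroids = train_centroids(training)
print(predict(centroids, [0.85, 0.75]))  # prints "male"
```

The point isn’t the classifier—it’s that the rule is never written by hand. Whatever regularities sit in the training photos, biases included, are what the system learns.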

There are already tons of applications of neural networks in our everyday lives, and face recognition technology has also been around for a while. But with FaceApp, an AI has now demonstrated it can perceive human faces in much the same way we do. We don’t know all the details of how it works. (Gizmodo reached out to FaceApp for comment on this piece and has not yet heard back.) But theory and clues the developers have offered up about their technology both suggest the algorithm is keying in on the same traits we use when judging the relative age, femininity or masculinity of faces.

The secrets to youth and aging

Perceptions of facial age can be boiled down to a few key changes, according to Richard Russell, associate professor of psychology and the director of the Perception Lab at Gettysburg College, who was not involved in FaceApp’s development. We all know the obvious cues to old age—wrinkles, age spots, sagging skin, and greying hair, for instance. In addition to these, FaceApp’s algorithm makes more subtle changes that Russell and his lab have shown make a person appear older.


Look closely at the eyes as my fiancé Jake is aged by FaceApp:

They aren’t just smaller. The skin at the base of the eye has been lightened, and the eye color has been dulled, making the eyes stand out less. This seemingly minor alteration—a loss of contrast between features and the skin—is one that Russell’s research has found plays an important role in our perception of older faces.
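That contrast cue is simple enough to toy with directly. The sketch below—pure Python on an invented one-dimensional strip of brightness values, not anything from FaceApp—“ages” an eye region by pulling every pixel toward the mean brightness, which shrinks exactly the feature-to-skin contrast Russell describes:

```python
# Toy illustration of the contrast cue: "aging" a face region by pulling
# pixel values toward the mean brightness, which reduces the contrast
# between dark features (like the iris) and the surrounding skin.

def reduce_contrast(pixels, amount):
    """Blend each pixel toward the mean (0.0 = no change, 1.0 = flat)."""
    mean = sum(pixels) / len(pixels)
    return [p + amount * (mean - p) for p in pixels]

# Invented brightness values along a line crossing an eye: skin, iris, skin.
young = [200, 200, 60, 60, 200, 200]   # dark iris stands out against light skin
older = reduce_contrast(young, 0.5)

contrast = lambda px: max(px) - min(px)
print(contrast(young), round(contrast(older)))  # prints "140 70"
```

Run the same operation in reverse—pushing pixels away from the mean—and the features pop again, which is one reason rejuvenating filters tend to brighten and sharpen the eyes.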

To make a face look pre-pubescent requires different alterations. Children’s faces have proportionally larger features and smaller foreheads, for example. So to turn an adult into a kid, FaceApp’s algorithm should key in on such traits, and of course, remove any strong cues of adulthood like facial hair. And voilà, that’s exactly what it does:


At least, that’s what it does for male faces. Intriguingly, while the app readily transforms men into boys, it seems to struggle with morphing women like me into girls:


That’s likely because women’s faces are much more similar to girls’ faces than men’s are to boys’. “Through the process of puberty, there are more changes that occur to the male face,” Russell told Gizmodo.

Although many changes happen in both sexes—the lower half of the face becomes proportionally larger, the eyes proportionally smaller, the jaw broadens, and brows become heavier, for example—these changes are more pronounced for males. With fewer striking changes, an AI has less to work with in its attempts to rejuvenate female faces.


Gender bender

We don’t often stop and think about what traits make a face look masculine or feminine, but FaceApp’s algorithm appears to have learned to spot the same differences that our brains instinctively use to perceive sex. Male faces have larger and darker brows, heavier jawlines, larger noses, more recessed eyes, and of course, facial hair. So to turn “female” into “male,” FaceApp masculinizes those traits:


There also appear to be subtle changes made by FaceApp that align with cues we subconsciously use to distinguish male from female faces. Russell and his colleagues have shown that male skin (head to toe) is, on average, darker and redder than female skin (regardless of race or ethnicity), though this difference doesn’t extend to other features. “Because there is a sex difference in the skin coloration but not in the coloration of the eyes and the lips, it results in there being more contrast around the features in female faces,” he explained, “so the features stand out more.”
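Russell’s facial-contrast idea can be captured with a single ratio. The sketch below uses a standard Michelson-style contrast measure on invented brightness numbers (these are not real measurements from his studies) to show why lighter skin around equally dark lips reads as higher-contrast, and therefore more stereotypically feminine:

```python
# Sketch of the facial-contrast measure Russell describes: because female
# skin is on average lighter while the eyes and lips are not, the features
# stand out more. The brightness numbers below are invented.

def feature_contrast(skin, feature):
    """Michelson-style contrast between skin and a darker facial feature."""
    return (skin - feature) / (skin + feature)

# Hypothetical brightness values (higher = lighter).
male_skin, male_lips = 140, 100
female_skin, female_lips = 180, 100

print(round(feature_contrast(male_skin, male_lips), 2))      # prints 0.17
print(round(feature_contrast(female_skin, female_lips), 2))  # prints 0.29
```

By this logic, a feminizing filter doesn’t need to “know” about makeup: darkening the lips or lightening the surrounding skin both push the same ratio up, which is exactly the lipstick-like effect visible in the filtered photos.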

The increased contrast is clear when I create hyper-female me using the female 2 filter. My eyes are not just larger, they’re more defined, and my lips are darker (as if I’m wearing even more lipstick). These alterations are similar to the young filter, but more exaggerated—so much so that hyper-female me appears younger than young me. Meanwhile, female Jake is unmistakably distinct from his usual self (left), yet his hyper-male self (right) looks almost unaltered:


Where FaceApp screwed up 

It’s all fun and games to alter age or swap sexes, but FaceApp drew a lot of criticism early on when it tried to take its transformations one step further and alter attractiveness. Users quickly noticed that the app’s “hot” filter had a strong preference for lighter skin tones and more European features:


The app’s developers quickly apologized for the AI’s racist tendencies, and briefly renamed the filter “spark” to remove the positive connotations (the filter has since been deleted from the app).

So, what went wrong? Without knowing exactly how the app was developed, Russell says it’s impossible to determine why it had such a strong whitewashing effect, but there are a few possibilities. The first is that the images in the training set were racially biased, with too many attractive, white faces—which might make sense if the developers predominantly relied on a sampling from the local population (Russians). But the bias could go deeper than the images themselves, Russell said—it could be the product of how the algorithm was trained, and who was making the initial attractiveness ratings. FaceApp’s developers have yet to disclose how attractiveness was scored for the training set.
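The training-set explanation is easy to demonstrate in miniature. In the sketch below—entirely invented data, not FaceApp’s—the photos rated “attractive” happen to skew light-skinned, and a simple least-squares fit dutifully learns skin lightness as a predictor of attractiveness, even though the correlation lives in the labels, not in any real-world truth:

```python
# Sketch of how a biased training set teaches a biased rule. The "dataset"
# is invented: a skin-lightness score (0..1) paired with attractiveness
# ratings, deliberately skewed so lighter faces got higher ratings.

def fit_slope(xs, ys):
    """Ordinary least-squares slope of y on x (one feature, no libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

lightness = [0.9, 0.8, 0.7, 0.3, 0.2]
rating    = [9.0, 8.5, 8.0, 4.0, 3.5]   # biased human labels, not ground truth

slope = fit_slope(lightness, rating)
print(slope > 0)  # prints True: the model "learns" lighter means hotter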


It’s important to note that quantifying attractiveness isn’t the same as quantifying facial differences due to age or sex. What is seen as “beautiful” can vary with a myriad of factors, including the beholder’s age, gender, or sexual preference. While scientists have determined some traits that tend to make a person seem more attractive—usually ones that also cue health, like facial symmetry—you can’t objectively quantify attractiveness in the same way you can eyebrow thickness or face shape. “The cues for attractiveness are pretty much always contextual,” said Russell. “There’s no gold standard—there’s no objective measurement.”

Realistic, but not true 

While FaceApp’s transformations are photorealistic, it’s important to note that looking believable doesn’t make them real. The algorithm can create face transformations that look like actual photos, but its results are far from accurate. A quick look at grade school Jake compared with FaceApp’s young Jake illustrates this point:

There’s little in common between the young man on the left—FaceApp’s young filtering of Jake—and what Jake really looked like at the cusp of puberty. Image courtesy of Christie Wilcox

FaceApp can’t show you how you’ll actually look in 40 years, just as it didn’t accurately show how Jake looked over a decade ago. Nor will it tell you how you’ll look after gender-affirming hormone treatment (though the manipulated images of yourself that alter your masculinity or femininity can still be emotionally stirring).

In the end, FaceApp is a novelty—software designed for entertaining our narcissistic tendencies. It’s not a window to viewing our true past, future, or gender-altered selves. But engineers and scientists have only just started to experiment with the possibilities afforded by neural networks. It’s not hard to envision similar algorithms accurately aging photos of missing kids so they can be reunited with their families, or helping doctors design and build realistic tissue reconstructions. Neural networks are already tackling a myriad of technological questions, and selfie filters are only the beginning.


Christie Wilcox is a science writer, author of Venomous: How Earth’s Deadliest Creatures Mastered Biochemistry, and all around biology nerd. Follow her on Twitter.



