For those who find reading a foreign film’s subtitles too distracting, movies are often dubbed into different languages. But that can be equally distracting when the movements of an actor’s mouth are completely out of sync with what they’re saying. So a company called Flawless has created an AI-powered solution that alters an actor’s facial performance to match the words in a film dubbed for foreign audiences.
There are very good reasons to be concerned about the use of artificial intelligence and neural network-powered tools to manipulate stills and videos to create situations that never actually happened, or to make public figures appear to say something they didn’t (as in the deepfakes that have spread like wildfire online). But the tools also have exciting potential for other uses—like continuing to make filmmaking more accessible and affordable.
The concerns mirror the same anxieties people had when computer graphics advanced in leaps and bounds thanks to films like Terminator 2, Twister, and Jurassic Park, which pushed photo-realistic visual effects further than we had ever seen before. Those advancements eventually trickled down to the consumer, and with just a smartphone and a laptop, amateur creators can now produce Hollywood-caliber films without the need for a Hollywood-sized budget. Those tools have also become easier and easier to use thanks to continued breakthroughs in AI-powered image processing. Erasing objects by hand in a video clip was once a very time-consuming process requiring complex masking and frame-by-frame adjustments, but it’s now a standard and automated feature in Adobe’s After Effects.
What Flawless is promising to do with its TrueSync software is use the same tools responsible for deepfake videos to manipulate and adjust an actor’s face in a film so that the movements of their mouth, and in turn the muscles in their face, more closely match how they would move had the original performance been given in the language a foreign audience is hearing. So even though an actor shot a film in English, to a moviegoer in Berlin watching the film dubbed in German, it would appear as if all of the actors were actually speaking German.
Is it necessary? That’s certainly up for debate. The recent Academy Award-winning film Parasite resurfaced the debate over dubbing a foreign film versus simply watching it with subtitles. One side feels that an endless string of text over a film is distracting and takes the focus away from everything else happening on screen. The other side feels that a dub performed by even a talented and seasoned voice artist simply can’t match or recreate the emotions behind the original actor’s performance, and that hearing that performance, even if the words aren’t understood, is essential to enjoying it as a whole.
Flawless says: “At the heart of the system is a performance preservation engine which captures all the nuance and emotions of the original material.” So even though an actor’s face is digitally replaced, their original nuanced performance will supposedly be preserved and carried over to the ‘fixed’ face featuring the correct lip movements for a foreign language. The company has shared a few examples of what the TrueSync tool is capable of on its website, and sure enough, Tom Hanks appears to be speaking flawless Japanese in Forrest Gump.
But like the uncanny valley that continues to plague our attempts to create realistic and believable CG people, there’s still something off about these TrueSync-processed clips. They don’t look as natural as the original performances do, partly because the technology behind them is still in its infancy. Even companies like Disney are working to improve the quality of deepfake technology so that it’s soon good enough for what Hollywood demands. The bigger question is whether this approach is more or less distracting than the options we have now for making a film accessible to a wider audience, and whether it’s worth the added cost of processing an entire two-hour film.