Disney Is Building Facial Recognition to Figure Out When You'll Laugh During Toy Story 5


The Walt Disney Company is using AI to determine how much audiences enjoy every single moment of their films.

At the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) last weekend, Disney Research and Caltech presented their technique for tracking the facial expressions of people watching movies.

The research team calls their new algorithm “factorized variational autoencoders” (FVAEs). They claim the technology is so effective at recognizing complex expressions that, after analyzing a single audience member’s face for about ten minutes, it can even predict that face’s future expressions throughout the remainder of a film.


To build a dataset of millions of facial landmarks to feed into a neural network, researchers used infrared cameras to film audiences at 150 showings of nine movies, including the recent Disney films Star Wars: The Force Awakens, Zootopia, Inside Out, and Big Hero 6.


The resulting AI system was then tested on other audiences. After tracking a moviegoer’s facial reactions for just a few minutes, the FVAEs could predict when that person would smile or laugh at significant moments in the film, outperforming other predictive methods even when those methods were given far more data. As the research progresses, FVAEs could presumably track other emotions, such as fear and sadness, as well.
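The core idea, stripped of the neural-network machinery, is a factorization: an audience-by-time matrix of reaction intensities is decomposed into per-moment movie factors shared by everyone and per-person style factors. Below is a toy, linear sketch of that idea (not Disney’s actual FVAE, which uses variational autoencoders; all names and numbers here are illustrative): after observing a new viewer’s early reactions, we solve for their latent style and use it to predict the rest.

```python
import numpy as np

# Toy illustration of the factorization idea behind FVAEs (linear stand-in,
# not the published model). Reactions = viewer style x per-moment movie signal.
rng = np.random.default_rng(0)
n_viewers, n_frames, rank = 50, 200, 3

movie_factors = rng.normal(size=(rank, n_frames))    # shared per-moment signal
viewer_factors = rng.normal(size=(n_viewers, rank))  # each person's "style"
reactions = viewer_factors @ movie_factors           # synthetic training data

# A new viewer arrives: we only observe their first chunk of the film.
new_viewer = rng.normal(size=(1, rank))
observed = 60  # frames seen so far (the "first few minutes")
early = new_viewer @ movie_factors[:, :observed]

# Infer the new viewer's latent style from the early frames (least squares),
# then predict their reactions for the remainder of the film.
style, *_ = np.linalg.lstsq(movie_factors[:, :observed].T, early.T, rcond=None)
predicted_rest = style.T @ movie_factors[:, observed:]
actual_rest = new_viewer @ movie_factors[:, observed:]

print(np.allclose(predicted_rest, actual_rest, atol=1e-6))  # True: noiseless case
```

In this noiseless toy the recovery is exact; the real system learns the factors from noisy landmark data with a neural encoder and decoder, but the prediction step, infer a viewer’s factors from a short observation window and extrapolate, is the same shape.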


Disney isn’t the only film company that wants to better understand how viewers react to movies. Over the last five years, Dolby Laboratories has been studying movie watchers on a neurophysiological level, strapping biosensors onto volunteers to track their reactions as they experience movies. The company runs these tests to show that its audio and visual technology can elicit stronger responses than competing technology.

But it will be interesting to see how Disney, the world’s second-largest media conglomerate, uses the data it collects from tracking audiences’ faces; this AI facial-tracking system could help the company understand audience reactions far better than human market researchers can. “Understanding human behavior is fundamental to developing AI systems that exhibit greater behavioral and social intelligence,” said Caltech machine learning professor Yisong Yue in a press statement. “After all, people don’t always explicitly say that they are unhappy or have some problem.”


Critics, actors, and directors already complain that the demands of Hollywood studios are stripping directors of their creative control. Imagine what the production and editing process will be like once executives can use AI to ensure each scene evokes the appropriate response.

[Caltech, Disney Research, TechCrunch]