Disney's research team is using deep learning to read the facial expressions of an audience while a film is being screened. Through artificial intelligence, Disney can gauge how we react as we watch a movie and evaluate the emotions we feel. The algorithm, called a "factorized variational autoencoder" (FVAE), can predict how a viewer will react for the rest of a movie after analyzing their facial expressions for just ten minutes. "The FVAEs have been able to learn concepts like smiling and laughing on their own, and how these expressions correlate with humorous scenes," Zhiwei Deng, a member of the company's research team, told Phys.org.

The experiment was carried out in a 400-seat theater fitted with four infrared cameras. The team recorded audiences during 150 screenings of nine films, including "The Jungle Book", "Big Hero 6", "Star Wars: The Force Awakens" and "Zootopia". The final result was a dataset of 16 million facial landmarks captured from 3,179 audience members.

The collected data are automatically translated into a series of numbers representing facial expressions such as a smile or open eyes. These numbers are linked through metadata, allowing the system to evaluate how an audience is reacting to a movie and, from there, to predict viewers' behavior during the rest of a feature film.

AI of this kind can also help us in other ways, for example in caring for the elderly, by picking up on the signals their body language emits. "People do not always explicitly say they are unhappy or have a problem," said Yisong Yue, a co-author of the study.
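The core factorization idea can be illustrated with a toy sketch. This is not Disney's actual FVAE, which is a neural variational model; the sketch below uses plain SVD-based matrix factorization on synthetic data, and every size in it (200 viewers, 120 time steps, 3 latent factors, 10 observed steps standing in for "ten minutes") is a made-up assumption. The point it demonstrates is the same: once shared per-time-step factors are learned from a training audience, a new viewer's first few reactions suffice to estimate their personal factor and predict the rest of the screening.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 200 viewers, 120 time steps, 3 latent factors.
n_viewers, n_steps, k = 200, 120, 3

# Synthetic "reaction" matrix: each viewer's expression intensity over time
# is a viewer factor times shared time factors, plus noise.
U = rng.normal(size=(n_viewers, k))        # per-viewer factors
V = rng.normal(size=(k, n_steps))          # per-time-step (movie) factors
R = U @ V + 0.1 * rng.normal(size=(n_viewers, n_steps))

# Learn shared time factors from the training audience via truncated SVD.
_, _, Vt = np.linalg.svd(R, full_matrices=False)
V_hat = Vt[:k]                             # (k, n_steps)

# A new viewer: we observe only their first 10 time steps.
u_true = rng.normal(size=k)
r_full = u_true @ V + 0.1 * rng.normal(size=n_steps)
t0 = 10
u_est, *_ = np.linalg.lstsq(V_hat[:, :t0].T, r_full[:t0], rcond=None)

# Predict the remainder of the screening from the estimated viewer factor.
r_pred = u_est @ V_hat[:, t0:]
err = np.mean((r_pred - r_full[t0:]) ** 2)
baseline = np.mean((r_full[t0:] - r_full[t0:].mean()) ** 2)
print(f"prediction MSE {err:.3f} vs. constant baseline {baseline:.3f}")
```

Here the prediction error should come out far below the constant baseline, mirroring the article's claim that a short observation window is enough to forecast a viewer's later reactions.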