In a previous blog, we discussed Mass Effect Andromeda and the widespread criticism the game received for its facial animation. In the original release, many characters had issues with their eye movements, and in several scenes the lip movement didn’t sync properly with the dialogue. The developers have since released a patch that fixes many of the facial animation issues that plagued the game at launch.

However, the criticism the developers received over it was intense. Naughty Dog's Jonathan Cooper, who animated on the first Mass Effect, even took to Twitter to defend the game’s developers, explaining some of the reasons why facial animation is incredibly hard. Those who are overly critical of games like Andromeda probably don’t realize the great lengths many studios and individual animators go to in order to produce realistic facial animation.

Here are some of the key reasons why animating faces is a challenge, and ways that technology and innovations in the industry are making the process a little easier for animators.

1. Game animations aren’t always scripted.

One of the main differences between film and game animation is the split between scripted and unscripted scenes. Films are made entirely of scripted scenes: animators have the exact words each character will say and can sync those with lip movements, manually going over every detail to make sure it matches the dialogue. When a film is played, it is exactly the same every time.

This is not always the case for game animators, especially in RPGs. Games contain both scripted and unscripted scenes. Players can make different choices that make the game more interactive, but that also create a vast number of possible scenarios and animations. Players have greater control over what the characters say and do, and game engines are expected to react to their commands in real time. It’s not possible to manually animate all of these unscripted scenes, so animators rely on algorithms that automate the lip sync process.

This is a huge time and cost saver. NaturalFront’s One-Click Animation can match audio files to lip movements in a matter of minutes. Even so, many professional animators will still go through and review the animation for quality. When up against a deadline, game animators may forgo this step.
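To make the idea concrete, here is a rough sketch of what an automated lip sync step can look like, assuming the audio has already been broken into timed phonemes by a speech analyzer. The phoneme and viseme names below are purely illustrative and are not taken from any particular engine or tool.

```python
# A minimal sketch of automated lip sync: timed phonemes in, viseme keyframes out.
# The phoneme-to-viseme table and the example data are invented for illustration.

PHONEME_TO_VISEME = {
    "AA": "open",   "AE": "open",
    "B": "closed",  "M": "closed", "P": "closed",
    "F": "teeth_on_lip", "V": "teeth_on_lip",
    "OW": "rounded", "UW": "rounded",
    "S": "narrow",  "T": "narrow",
}

def lip_sync_keyframes(timed_phonemes, fps=30):
    """Convert (phoneme, start_seconds) pairs into per-frame viseme keyframes."""
    keyframes = []
    for phoneme, start in timed_phonemes:
        viseme = PHONEME_TO_VISEME.get(phoneme, "neutral")
        frame = round(start * fps)
        keyframes.append((frame, viseme))
    return keyframes

# Example: speech analysis of the word "map" might yield something like this.
print(lip_sync_keyframes([("M", 0.00), ("AE", 0.08), ("P", 0.21)]))
# -> [(0, 'closed'), (2, 'open'), (6, 'closed')]
```

In practice, a first pass like this is exactly what gets reviewed and hand-polished when time allows.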

2. It is easy to slip into the uncanny valley.

We’re at an interesting point in game animation. Characters can look so lifelike that you feel you could almost reach through the screen and touch them. However, there is also the uncanny valley, which animators constantly have to combat. The uncanny valley describes animations that look almost real but are just off enough to have a creepy quality. It is most noticeable in characters’ eyes, which can take on a dead, zombie-like stare.

It only takes a second of flawed footage to unravel hours of work. There are ways to avoid the uncanny valley, such as eliminating inconsistencies and using real-life references to shape your animations. Even so, the uncanny valley is something everyone in the animation industry has likely faced at one time or another.

3. There are over 10,000 facial expressions.

The face is complex. It is built from an intricate system of muscles and is one of the most distinctive features of a person’s body. We say a lot with our facial expressions, and even the subtlest change can completely alter what a person means.
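One common way game rigs cope with that complexity is blend shapes: the face is stored as a neutral mesh plus a set of sculpted offsets (a smile, a brow raise, and so on), and an expression is a weighted mix of those offsets. The tiny sketch below illustrates the idea; the vertices and shape names are invented for illustration, and a real rig would use hundreds of shapes across thousands of vertices.

```python
# A simplified sketch of blend-shape mixing: an expression is the neutral face
# plus a weighted sum of per-vertex offsets. All data here is made up.

NEUTRAL = [(0.0, 0.0), (1.0, 0.0), (0.5, -0.5)]  # tiny stand-in "mesh"

BLEND_SHAPES = {
    # Each shape stores how far every vertex moves at full strength.
    "smile":      [(0.0, 0.1), (0.0, 0.1), (0.0, 0.0)],
    "brow_raise": [(0.0, 0.0), (0.0, 0.0), (0.0, 0.2)],
}

def apply_expression(weights):
    """Blend the neutral mesh with weighted shape offsets (weights in 0..1)."""
    result = []
    for i, (x, y) in enumerate(NEUTRAL):
        for name, weight in weights.items():
            dx, dy = BLEND_SHAPES[name][i]
            x += weight * dx
            y += weight * dy
        result.append((round(x, 3), round(y, 3)))
    return result

# A subtle expression: a faint smile with a slight brow raise.
print(apply_expression({"smile": 0.3, "brow_raise": 0.5}))
```

Because the weights are continuous, tiny adjustments to them produce exactly the kind of subtle shifts in expression described above.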

Part of capturing the right facial expression for the emotion you are trying to convey is a solid foundation in animation principles and a knowledge of human anatomy. Another part is knowing which technologies and tools can make the process easier, faster, and smoother.

NaturalFront’s Facial Animation Software is designed specifically to streamline this process for animators and hobbyists, making it possible to create anatomically accurate, expressive facial animations in a matter of minutes.