EA Wants To Automate Emotions Of In-Game Characters Using AI

The AI will predict the facial expressions of characters and produce them after analyzing several variables.

Story Highlights

  • EA has published a patent that seeks to generate the facial expressions of in-game characters using AI, after studying the character’s poses and other data.
  • The trained AI model will receive character poses and other information as inputs to predict and produce facial expressions appropriate to the moment.
  • The patent aims to help developers by automatically generating expressions for characters of all types, which could speed up development and reduce the large amount of storage games require.
  • MMORPGs featuring many in-game characters could use the system to generate expressions automatically, including in cases where motion capture is not feasible.

EA has churned out a slew of patents that seek to help game developers automate various aspects of game development, and its most recent one could address one of the industry’s biggest ongoing dilemmas. The conglomerate has published a new patent that seeks to automate the facial expressions of in-game characters by predicting and creating them with AI, replacing countless hours of manual work.

The patent, dubbed “PREDICTING FACIAL EXPRESSIONS USING CHARACTER MOTION STATES,” describes a system that analyzes the current character and situation to produce the most appropriate facial expression using AI, so developers do not have to account for every single moment when adding facial expressions to a game. The patent notes that the currently used methods involve tedious tasks that AI could take over.

“As each character’s range of facial expressions may be required to be modeled separately, modeling the facial expressions can require a large amount of work. For example, a game developer may experience delays and/or difficulties when identifying particular facial expressions to map to characters at different stages of the video game,” mentions EA.

The image shows an example diagram for providing a pose of a character to AI.

Thus, the patent proposes an AI-based system that relies on a character’s pose during gameplay to automatically generate facial expressions. Here, pose refers to the character’s position and orientation, along with the various data sets associated with it. For instance, a character could be 80% happy, 10% puzzled, and 10% excited, and the AI would use those weights to generate the expression automatically.
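The patent itself does not publish an algorithm, but one plausible reading of those percentages is a weighted blend of per-emotion expression presets. In this purely hypothetical Python sketch, each emotion maps to a set of blendshape weights (the preset names and values are illustrative, not from the patent), and the final expression is their weighted average:

```python
# Hypothetical sketch: blend per-emotion blendshape presets by emotion
# weights (e.g. 80% happy, 10% puzzled, 10% excited). All names and
# values here are illustrative assumptions, not taken from EA's patent.

# Each preset maps blendshape names to activation values in [0, 1].
PRESETS = {
    "happy":   {"mouth_smile": 0.9, "brow_raise": 0.2, "eye_squint": 0.4},
    "puzzled": {"mouth_smile": 0.0, "brow_raise": 0.7, "eye_squint": 0.1},
    "excited": {"mouth_smile": 0.8, "brow_raise": 0.9, "eye_squint": 0.0},
}

def blend_expression(emotion_weights):
    """Weighted average of the presets selected by emotion_weights."""
    blend = {}
    for emotion, weight in emotion_weights.items():
        for shape, value in PRESETS[emotion].items():
            blend[shape] = blend.get(shape, 0.0) + weight * value
    return blend

params = blend_expression({"happy": 0.8, "puzzled": 0.1, "excited": 0.1})
print({k: round(v, 2) for k, v in sorted(params.items())})
```

With the weights from the article’s example, the happy preset dominates the blend while the other two nudge individual blendshapes slightly.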

The trained machine learning model will receive character poses as inputs to predict and produce the facial expressions appropriate to the moment.

“The machine learning model can predict the facial expression of the character. Therefore, the system, using input identifying the pose of the character, can generate facial expression parameters and/or a facial expression for the character,” the patent reads.
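The inference step the patent describes, a trained model mapping a pose to expression parameters, can be sketched minimally. The feature names, output names, and weights below are placeholder assumptions standing in for whatever an actual trained model would learn; a single linear layer with a sigmoid is used here only to make the input-to-output mapping concrete:

```python
import math

# Hypothetical sketch of the patent's described inference step: a trained
# model takes a pose feature vector and emits facial-expression
# parameters. The weights below are illustrative placeholders.

POSE_FEATURES = ["head_pitch", "torso_lean", "run_speed"]
EXPRESSION_PARAMS = ["mouth_open", "brow_furrow"]

# One weight row per output parameter (placeholder values).
WEIGHTS = [
    [0.1, 0.3, 0.6],   # mouth_open: driven mostly by run_speed
    [0.5, 0.2, -0.1],  # brow_furrow: driven mostly by head_pitch
]
BIASES = [0.0, 0.1]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_expression(pose):
    """Map a pose vector to expression parameters squashed into (0, 1)."""
    x = [pose[f] for f in POSE_FEATURES]
    out = {}
    for name, row, b in zip(EXPRESSION_PARAMS, WEIGHTS, BIASES):
        out[name] = sigmoid(sum(w * v for w, v in zip(row, x)) + b)
    return out

pose = {"head_pitch": 0.2, "torso_lean": 0.5, "run_speed": 1.0}
print(predict_expression(pose))
```

In a shipped game, the model would presumably be far larger and trained on animation data, but the shape of the pipeline, pose features in, expression parameters out, matches what the patent outlines.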

The image shows generating a facial expression of a character based on the pose of the character.

The system first identifies the facial expression parameters associated with a specific character pose before generating a latent facial expression for the character. The patent goes deep into the technical details of how the automated system works. AI could become a notable tool for automatically generating a character’s emotions by predicting them from poses and other data.

Currently, motion capture cannot work in all situations, such as when a character is wearing a mask; the proposed system would resolve that impediment. Relying on AI to predict and produce facial expressions would also reduce development time and the storage space a project requires, as it would significantly lower the number of expressions that must be manually authored and coded into the game.

The image shows a flowchart for generating a facial expression of a character based on a pose.

EA’s proposed system could help its developers create more lifelike characters at scale while cutting down the manual work required of them. It would be especially beneficial in MMORPG titles featuring a large cast of characters, as the patent explicitly mentions. EA appears poised to use the system across a wide variety of its games and all sorts of in-game characters.

EA has previously published a plethora of intriguing patents, including one that could ban players for colluding with enemy teams during gameplay. It has also secured a patent to automate in-game challenges based on real-life events, and another for an automated coaching system for online gaming.



Source
Patentscope

Shameer Sarfaraz is a Senior News Writer at eXputer who keeps up devoutly with the gaming and entertainment industries. He has a Bachelor’s Degree in Computer Science and several years of experience reporting on games. Besides his passion for breaking news stories, Shameer loves spending his leisure time farming away in Stardew Valley. VGC, IGN, GameSpot, Game Rant, TheGamer, GamingBolt, The Verge, NME, Metro, Dot Esports, GameByte, Kotaku Australia, PC Gamer, and more have cited his articles.

Experience: 4+ Years || Education: Bachelor in Computer Science.
