Final Fantasy VII Remake relies heavily on AI for character facial expressions, lip-syncing, and more.
The remake of 1997’s Final Fantasy VII is without a doubt one of the most anticipated titles of the past few years. It was officially announced back in 2015, and after years of development, fans will finally be able to play it next week.
As revealed in the latest issue of Edge magazine (issue #344), the Final Fantasy VII Remake team not only worked on the game itself but also developed AI technology to make the game’s characters as realistic as possible. According to Final Fantasy VII Remake co-director Naoki Hamaguchi, the game makes heavy use of AI that detects the emotion in dialogue in order to drive characters’ facial expressions.
In addition, the AI tech handles lip-syncing and camera movement, calculating the distance and angle to the characters in a given scene in order to position the in-game camera.
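Square Enix hasn’t published any implementation details, but the underlying idea of framing a shot by distance and angle is easy to illustrate. Below is a rough, hypothetical sketch in Python; the Vec3 type, the frame_characters function, and the numbers are illustrative assumptions, not the studio’s actual system.

```python
import math
from dataclasses import dataclass

# A rough, hypothetical sketch of distance/angle-driven camera placement.
# None of this is Square Enix's actual code; it only illustrates the idea.

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def frame_characters(characters, angle_deg, distance):
    """Place the camera at the given distance and horizontal angle from the
    midpoint of the characters in the scene, and aim it at that midpoint."""
    # The midpoint of the character positions acts as the look-at target.
    cx = sum(c.x for c in characters) / len(characters)
    cy = sum(c.y for c in characters) / len(characters)
    cz = sum(c.z for c in characters) / len(characters)
    target = Vec3(cx, cy, cz)

    # Back the camera off along the requested angle in the horizontal plane.
    rad = math.radians(angle_deg)
    position = Vec3(cx + distance * math.cos(rad),
                    cy,  # keep the camera at roughly character height
                    cz + distance * math.sin(rad))
    return position, target

# Example: two characters a couple of metres apart, camera at 45 degrees.
cloud, tifa = Vec3(0.0, 1.7, 0.0), Vec3(2.0, 1.7, 0.0)
print(frame_characters([cloud, tifa], angle_deg=45.0, distance=5.0))
```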
"You can generally tell the emotional content of any piece of dialogue from the intonation and patterns, if you look at it on a graph and at how the levels of tension go up and down", Hamaguchi said.
"So we took a number of samples from various different voice data, downloaded them into a database, and then looked for the patterns to create a system where we could get a very high level of recognition on the actual emotional content of any piece of dialogue."
That’s some pretty impressive tech right there for the upcoming remake.
Final Fantasy VII Remake launches globally on April 10, but Square Enix has already shipped the game early to make sure that as many players as possible will be able to play it on launch day.