- Nvidia showcases AI-driven “digital human” technology for gaming avatars at the Game Developers Conference.
- The Covert Protocol demo highlights real-time NPC reactions to player interactions powered by Nvidia’s Avatar Cloud Engine (ACE).
- Inworld AI, Nvidia’s collaborator on the demo, plans to release its source code to foster wider adoption among developers.
- Inworld’s partnership with Microsoft targets Xbox tools for AI-driven character and narrative development.
- Audio2Face technology automates facial animation and lip-sync, simplifying multilingual game development.
- Despite advancements, AI-powered NPCs still lack the conversational depth of real people.
- Concerns arise regarding the potential impact of AI adoption on the video game voice acting industry.
Main AI News:
Nvidia recently unveiled how developers are leveraging its AI-driven “digital human” capabilities to voice, animate, and craft dialogues for gaming avatars. At the Game Developers Conference, the company debuted Covert Protocol, a playable tech demonstration spotlighting the potential of its AI tools to enable non-player characters (NPCs) to dynamically react to player interactions, generating tailored responses in real-time.
In the demonstration, players assume the role of a private investigator, working through tasks that hinge on conversations with AI-powered NPCs. Nvidia says each gameplay session is unique, with players’ interactions shaping the outcome. John Spitzer, Nvidia’s Vice President of Developer and Performance Technologies, says the company’s AI technology could drive the intricate animations and lifelike conversational speech essential for authentic digital interactions.
Covert Protocol was developed in collaboration with Inworld AI, an AI gaming startup, and runs on Nvidia’s Avatar Cloud Engine (ACE), the same framework behind Nvidia’s futuristic ramen shop demonstration from last year. So far, the demo mostly shows short clips of NPCs delivering voice lines; how well it holds up in actual gameplay is untested, and both the delivery and the lip-sync animation retain a somewhat robotic, chatbot-like quality.
Inworld AI plans to release Covert Protocol’s source code in the near future to encourage wider adoption of Nvidia’s ACE digital human technology among developers. Additionally, Inworld announced a partnership with Microsoft in November 2023 to facilitate the development of Xbox tools for creating AI-driven characters, narratives, and quests.
Nvidia also spotlighted its Audio2Face technology, which appears in a preview of the upcoming MMO World of Jade Dynasty, where a character seamlessly lip-syncs to both English and Mandarin Chinese speech. Audio2Face promises to streamline multilingual game development by automating facial animations, reducing the need for manual reanimation. Another demonstration showed Audio2Face in the upcoming action melee game Unawake, illustrating its use in both cinematics and gameplay sequences.
While these tech demonstrations may inspire game developers to experiment with AI-driven NPCs in their own projects, the conversational side still lags behind. Characters in Covert Protocol never quite come across as “real people,” echoing impressions of the earlier Kairos demos. That is unlikely to reassure video game voice actors, who fear what AI adoption could mean for their careers and livelihoods.
Conclusion:
Nvidia’s advancements in AI-driven NPC technology and multilingual development tools mark a major step forward for the gaming industry. The challenge now is closing the gap between AI-generated interactions and the nuanced conversational depth of real people. Even so, this innovation sets the stage for more immersive gaming experiences and underscores how the video game market will need to adapt to evolving technologies.