Nvidia is using AI to turn game characters into chatbots.

Nvidia is showing how developers have begun using its AI “digital human” tools to create voices, animations, and dialogue for video game characters. At the Game Developers Conference on Monday, the company showed off Covert Protocol, a playable tech demo that demonstrates how its AI tools can let NPCs respond to player interactions in unique ways, generating new responses suited to live gameplay.

In the demo, players take on the role of a private detective, completing objectives based on interactions with AI-powered NPCs. Nvidia claims that each playthrough is “unique”, with real-time player interactions leading to different game outcomes. John Spitzer, Nvidia’s vice president of developer and performance technologies, says the company’s AI tech can “power the complex animations and conversational speech needed to make digital interactions feel real.”

Covert Protocol was created in collaboration with the AI gaming startup Inworld AI, and uses Nvidia’s Avatar Cloud Engine (ACE) technology, the same technology that powered the futuristic ramen shop demo Nvidia released last May. The new Covert Protocol demo doesn’t show how effective these AI-powered NPCs are in actual gameplay, instead offering a selection of clips of NPCs delivering various voice lines. Both the line delivery and the lip-syncing animation feel robotic, as if a real chatbot were talking to you through the screen.

Inworld says it plans to release Covert Protocol’s source code “in the near future” to encourage other developers to adopt Nvidia’s ACE digital human tech. Inworld also announced a partnership with Microsoft in November 2023 to help develop Xbox tools for creating AI-powered characters, stories, and quests.

Nvidia’s Audio2Face tech was also featured in a clip from the upcoming MMO World of Jade Dynasty, which demonstrated a single character lip-syncing speech in both English and Mandarin Chinese. The idea is that Audio2Face will make it easier to build games in multiple languages without having to manually re-animate characters. Another clip, from the upcoming action melee game Unawake, demonstrated how Audio2Face can be used to create facial animations during both cinematics and gameplay.

These tech demos may be enough to convince game developers to experiment with adding AI-powered NPCs to their titles, but judged on the interactions alone, things don’t seem to have progressed much: the characters in Covert Protocol feel no more like “real people” than those in the earlier Kairos demo. Even so, that’s unlikely to calm angry video game voice actors who worry about how the adoption of AI will affect their careers and livelihoods.
