NVIDIA has unveiled its latest advancements in digital human technologies at the SIGGRAPH 2024 conference. These innovations aim to revolutionize customer interactions across various industries through the use of hyperrealistic digital avatars and cutting-edge AI.
Introducing James: A Digital Brand Ambassador
One of the highlights of NVIDIA’s presentation is James, an interactive digital human designed to engage with people using emotion, humor, and contextual understanding. James is built on NVIDIA ACE (Avatar Cloud Engine) and uses retrieval-augmented generation (RAG) to ground its responses in accurate, relevant information. For lifelike animation and speech, James combines NVIDIA’s RTX rendering technologies with ElevenLabs’ natural-sounding voice technology.
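The RAG pattern mentioned above retrieves relevant passages from a knowledge base and feeds them to a language model alongside the user's question, so answers stay grounded in known facts. The sketch below is a minimal, illustrative version of that idea; the function names, scoring method, and sample knowledge base are all hypothetical and not drawn from NVIDIA's actual pipeline.

```python
import re

# Illustrative RAG sketch: keyword-overlap retrieval plus prompt assembly.
# A production system would use vector embeddings, but the control flow
# (retrieve, then ground the prompt) is the same.

def tokens(text: str) -> set[str]:
    """Lowercase word set, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def score(query: str, doc: str) -> int:
    """Toy relevance score: number of words the query and document share."""
    return len(tokens(query) & tokens(doc))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble a prompt that grounds the model's answer in retrieved text."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical knowledge base a brand-ambassador avatar might draw on.
KB = [
    "Store hours are 9am to 6pm on weekdays.",
    "Returns are accepted within 30 days with a receipt.",
    "The flagship store is located in San Jose.",
]

print(build_prompt("What are the store hours?", KB))
```

Because the prompt is built only from retrieved passages, the model's answer can cite specifics (hours, policies) it would otherwise have no way to know, which is what lets an avatar like James give "accurate, informative responses."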
NVIDIA Maxine’s Role in Enhancing Digital Humans
NVIDIA also showcased the latest improvements to its Maxine AI platform, which enhances the audio and video quality of digital humans. Maxine 3D converts 2D video input into a 3D avatar, making it well suited to applications such as video conferencing, while Audio2Face-2D animates a static portrait from audio input, creating a dynamic digital human from a single image. Both technologies are currently in early access.
Industry Adoption and Applications
Several companies are already utilizing NVIDIA’s digital human technologies. HTC has integrated Audio2Face-3D into its VIVERSE AI agent, enhancing user interactions with dynamic facial animations. Looking Glass is using Maxine’s 3D AI capabilities to create real-time holographic feeds for its spatial displays. Reply has developed Futura, a digital assistant for Costa Crociere’s cruise ships, using NVIDIA’s ACE and Maxine technologies.
UneeQ is another notable adopter, showcasing demos of cloud-rendered digital humans and advanced avatars powered by NVIDIA GPUs and AI models. These technologies promise to deliver more natural, responsive virtual customer service experiences.
For more information, visit the NVIDIA Blog.
Image source: Shutterstock