NVIDIA unveiled its latest advancements in digital human technology at the SIGGRAPH 2024 conference. These innovations aim to transform customer interactions across a range of industries using hyper-realistic digital avatars and cutting-edge AI.
About James: Digital Brand Ambassador
One of the highlights of NVIDIA’s presentation was James, a conversational digital human designed to connect with people through emotion, humor, and contextual understanding. James is built on NVIDIA ACE (Avatar Cloud Engine) and uses retrieval-augmented generation (RAG) to deliver accurate, informative responses. Lifelike animation and voice come from NVIDIA’s RTX rendering technology and ElevenLabs’ natural speech technology.
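The RAG pattern James relies on can be illustrated with a minimal sketch: retrieve the documents most relevant to a query, then ground the model's prompt in that retrieved context. Everything below is illustrative, not NVIDIA's implementation: the toy corpus, the keyword-overlap retriever, and the prompt format are assumptions, and a production system would use vector embeddings and a large language model instead.

```python
def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive keyword overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Ground the eventual answer in retrieved context (the core RAG idea)."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Hypothetical two-document knowledge base for demonstration.
corpus = [
    "NVIDIA ACE is a suite of technologies for building digital humans.",
    "Maxine improves audio and video quality for real-time communication.",
]
print(build_prompt("What is NVIDIA ACE?", corpus))
```

The retriever here is deliberately simplistic; the point is the two-step shape (retrieve, then generate with context), which is what lets a conversational agent answer from current, factual material rather than from model weights alone.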
NVIDIA Maxine’s Role in Empowering Digital Humans
NVIDIA also showcased the latest improvements to its Maxine AI platform, which enhances the audio and video quality of digital humans. Maxine 3D transforms 2D video inputs into 3D avatars, making it well suited to applications such as video conferencing. Audio2Face-2D animates static portraits from audio inputs, creating dynamic digital humans from a single image. Both technologies are currently in early access.
Industry Adoption and Applications
Several companies are already leveraging NVIDIA’s digital human technology. HTC integrated Audio2Face-3D into its VIVERSE AI agent to improve user interaction with dynamic facial animation. Looking Glass is using Maxine’s 3D AI capabilities to create real-time holographic feeds for spatial displays. Reply used NVIDIA’s ACE and Maxine technologies to develop Futura, a digital assistant for Costa Crociere’s cruise ships.
UneeQ is another notable adopter, demonstrating cloud-rendered digital humans and advanced avatars powered by NVIDIA GPUs and AI models. These technologies promise to deliver more natural and responsive virtual customer service experiences.
For more information, visit the NVIDIA blog.