NVIDIA has officially announced the general availability of its ACE generative AI microservices, a suite of technologies designed to accelerate the development of lifelike digital humans. According to the NVIDIA Newsroom, the advancement is aimed at transforming industries such as gaming, healthcare, and customer service.
Extending functionality through ACE
The NVIDIA Avatar Cloud Engine (ACE) family of microservices comprises a range of technologies for creating and animating realistic digital humans. These technologies are now generally available for cloud deployment and in early access for RTX AI PCs. Companies including Dell Technologies, ServiceNow, and Perfect World Games have already begun integrating ACE into their operations.
Technologies in the ACE suite include the following (a pipeline sketch follows the list):
- NVIDIA Riva, for automatic speech recognition (ASR), text-to-speech (TTS), and neural machine translation (NMT).
- NVIDIA Nemotron, for language understanding and context-aware response generation.
- NVIDIA Audio2Face, for realistic facial animation driven by audio tracks.
- NVIDIA Omniverse RTX, for rendering realistic skin and hair with real-time path tracing.
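Taken together, the list describes a pipeline rather than a single product: speech comes in through Riva, a language model such as Nemotron produces a response, and Audio2Face turns the synthesized speech into facial animation. The sketch below shows how one conversational turn might be orchestrated. The endpoint URLs, payload fields, and model names are illustrative placeholders, not NVIDIA's published APIs; the only convention assumed is that NIM language-model endpoints follow the OpenAI chat-completions format.

```python
# Hypothetical sketch of one digital-human turn: ASR -> LLM -> TTS -> facial animation.
# All URLs, payload fields, and model names are placeholders for illustration only.
import requests

ASR_URL = "http://localhost:9000/v1/asr"                # placeholder Riva-style ASR endpoint
LLM_URL = "http://localhost:8000/v1/chat/completions"   # OpenAI-compatible NIM convention
TTS_URL = "http://localhost:9000/v1/tts"                # placeholder Riva-style TTS endpoint
A2F_URL = "http://localhost:9100/v1/audio2face"         # placeholder facial-animation endpoint


def npc_turn(user_audio_wav: bytes) -> dict:
    """Run one conversational turn for a digital-human NPC."""
    # 1. Speech recognition: audio in, transcript out.
    transcript = requests.post(ASR_URL, files={"audio": user_audio_wav}).json()["text"]

    # 2. Language model: generate a context-aware reply (OpenAI-style request body).
    reply = requests.post(LLM_URL, json={
        "model": "nemotron-placeholder",
        "messages": [
            {"role": "system", "content": "You are a friendly in-game shopkeeper."},
            {"role": "user", "content": transcript},
        ],
    }).json()["choices"][0]["message"]["content"]

    # 3. Text-to-speech: reply text in, synthesized audio out.
    speech_wav = requests.post(TTS_URL, json={"text": reply}).content

    # 4. Facial animation: audio in, animation data (e.g. blendshape curves) out.
    animation = requests.post(A2F_URL, files={"audio": speech_wav}).json()

    return {"transcript": transcript, "reply": reply, "animation": animation}
```

In a real deployment the rendering step (Omniverse RTX) would consume the animation data to drive the avatar, and the microservices could run in the cloud, in a data center, or locally on an RTX AI PC.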
Future technologies on the horizon
NVIDIA also announced upcoming technologies, including NVIDIA Audio2Gesture, which generates body gestures from audio tracks, and NVIDIA Nemotron-3 4.5B, a new small language model designed for low-latency, on-device inference on RTX AI PCs.
“Digital humans will revolutionize industries,” said Jensen Huang, founder and CEO of NVIDIA. “The breakthroughs in multimodal large language models and neural graphics that NVIDIA ACE brings to the developer ecosystem will bring us closer to a future of intent-based computing, where interacting with computers is as natural as interacting with humans.”
Deploying ACE at scale
To date, NVIDIA has offered ACE as NVIDIA NIM inference microservices that developers can deploy in their own data centers. The company is now extending this offering to more than 100 million RTX AI PCs and laptops. The new AI Inference Manager SDK simplifies ACE deployment on PCs by pre-configuring the necessary AI models and dependencies.
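The announcement does not detail the AI Inference Manager SDK's interfaces, but NIM language-model microservices in general expose OpenAI-compatible endpoints, so a model hosted locally on an RTX AI PC could be queried with standard tooling. A minimal sketch under that assumption follows; the base URL, port, and model name are placeholders.

```python
# Sketch: querying a locally hosted, OpenAI-compatible NIM language-model endpoint.
# The base URL and model name are placeholders; a local deployment typically
# does not require a real API key.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="nemotron-placeholder",   # whichever model the local microservice serves
    messages=[
        {"role": "system", "content": "You are an NPC guide in a game world."},
        {"role": "user", "content": "Where can I find the blacksmith?"},
    ],
    max_tokens=128,
)

print(response.choices[0].message.content)
```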
At COMPUTEX, NVIDIA presented an updated version of its Covert Protocol technology demo developed in collaboration with Inworld AI. The demo allows players to interact with digital human NPCs using conversational language to complete missions.
Industry Adoption and Future Outlook
Companies such as Aww Inc., Inventec, and Perfect World Games are leading the adoption of ACE technology. Aww Inc. plans to use the ACE Audio2Face microservice for real-time animation of the virtual celebrity Imma. Perfect World Games is integrating ACE into Legends, a mythic wilderness tech demo that allows players to interact with multilingual AI NPCs.
Inventec is using ACE to enhance its healthcare AI agents within its VRSTATE platform to deliver more engaging virtual consultation experiences. ServiceNow showcased ACE NIM in a generative AI service agent demo for the Now Assist Gen AI Experience, highlighting its potential to improve customer and employee interactions.
Innovations at COMPUTEX 2024
The NVIDIA art team also used generative AI tools built on ACE to create a “digital Jensen” avatar for COMPUTEX 2024. The multilingual avatar captures Huang’s distinctive voice and style in Mandarin Chinese and English, with speech generated using ElevenLabs’ AI voice technology.
ACE NIM microservices, including Riva and Audio2Face, are now in production, with NVIDIA AI Enterprise software providing developers with enterprise-grade support. Early access to ACE NIM microservices running on RTX AI PCs is also available.