NVIDIA researchers have demonstrated the impressive capabilities of real-time generative AI for creating immersive 3D environments. During a live demo at the Real-Time Live event at SIGGRAPH 2024, they showed how NVIDIA Edify, a multimodal architecture for visual generative AI, can rapidly build detailed 3D landscapes.
Accelerate 3D world building
The demo highlighted how an AI agent powered by NVIDIA Edify could compose and edit a desert landscape from scratch in just five minutes. The technology acts as an artist’s assistant, significantly reducing the time needed to generate ideas and create custom ancillary assets that would otherwise have to be sourced from repositories.
These AI technologies can help 3D artists be more productive and creative by drastically reducing idea generation time. Instead of spending hours finding or creating background assets or 360 HDRi environments, artists can create them in minutes.
From concept to 3D scene
Creating an entire 3D scene is typically a complex and time-consuming task. But with the help of AI agents, creative teams can quickly bring concepts to life and iterate to achieve the desired look. During the Real-Time Live demo, researchers used AI agents to instruct NVIDIA Edify-based models to generate dozens of 3D assets, including cacti, rocks, and a bull skull, with previews generated in seconds.
The AI agent then used other models to create candidate backgrounds and layouts for placing objects within the scene. It also adapted to last-minute creative changes, such as quickly turning a rock into a gold nugget. Once the design plan was established, the AI agent generated full-quality assets and rendered the scene into a photorealistic image using NVIDIA Omniverse USD Composer.
Features of NVIDIA Edify
NVIDIA Edify models accelerate the creation of background environments and objects using AI-based scene generation tools, while empowering creators to focus on hero assets. The Real-Time Live demo featured two Edify models.
- Edify 3D: Generate editable 3D meshes from text or image prompts and create previews with rotation animations in seconds, enabling creators to prototype quickly.
- Edify 360 HDRi: Create up to 16K high dynamic range images (HDRi) of natural landscapes from text or image prompts, for use as backgrounds and scene lighting.
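In OpenUSD scenes, a 360 HDRi environment is typically wired in as the texture of a dome light, which provides both the backdrop and image-based lighting. A minimal sketch of such a layer (the file name and prim names are hypothetical examples, not output from Edify):

```usda
#usda 1.0

def Xform "World"
{
    # Dome light whose texture is a generated 360 HDRi environment.
    # The .exr path is a hypothetical placeholder.
    def DomeLight "EnvLight"
    {
        asset inputs:texture:file = @./desert_sky_16k.exr@
    }
}
```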
The demo also showcased an AI agent powered by a large language model, along with USD Layout, an AI model that generates scene layouts using OpenUSD, an open framework for 3D workflows.
Industry adoption
At SIGGRAPH, NVIDIA announced that leading creative content companies are using NVIDIA Edify-based tools to boost productivity with generative AI. Shutterstock launched the commercial beta of its Generative 3D service, enabling creators to quickly prototype and generate 3D assets using text or image prompts. The service’s 360 HDRi generator, powered by Edify, is also in early access.
Getty Images has updated its Generative AI service with the latest version of NVIDIA Edify, enabling users to create images twice as fast, with improved output quality, better prompt adherence, advanced controls, and fine-tuning.
Compatibility with NVIDIA Omniverse
3D objects, environment maps, and layouts created using Edify models are structured in USD, a standard format for describing and constructing 3D worlds. This compatibility allows artists to seamlessly import Edify-based creations into Omniverse USD Composer. Within Composer, artists can further modify the scene by repositioning objects, changing shapes, or adjusting lighting.
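Because USD layers are plain text (`.usda`), a scene layout of the kind the agent produces can be expressed as a readable file that Composer can open directly. As a rough illustration only (this is not NVIDIA's actual tooling; the helper function, asset names, and coordinates are all hypothetical), a minimal layer placing a few props might be generated like this:

```python
# Illustrative sketch: emit a minimal OpenUSD (.usda) text layer that
# places named assets at given positions. The helper and the asset
# names are hypothetical; real pipelines use the full USD toolchain.

def layout_to_usda(placements):
    """Build a .usda text layer from (name, (x, y, z)) placements."""
    lines = [
        '#usda 1.0',
        '(',
        '    defaultPrim = "World"',
        ')',
        '',
        'def Xform "World"',
        '{',
    ]
    for name, (x, y, z) in placements:
        lines += [
            f'    def Xform "{name}"',
            '    {',
            f'        double3 xformOp:translate = ({x}, {y}, {z})',
            '        uniform token[] xformOpOrder = ["xformOp:translate"]',
            '    }',
        ]
    lines.append('}')
    return '\n'.join(lines)

# Example: two of the demo's desert props, at made-up coordinates.
scene = layout_to_usda([("Cactus", (0, 0, 0)), ("BullSkull", (2.5, 0, 1.0))])
print(scene)
```

Keeping the layout as a text layer like this is what makes the "edit in Composer" step natural: repositioning an object is just a change to its `xformOp:translate` value in the layer.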
Real-Time Live is one of the most anticipated events at SIGGRAPH, showcasing real-time applications such as generative AI, virtual reality, and live performance capture.
For more information, visit the NVIDIA blog.
Image source: Shutterstock