NVIDIA and Unreal Engine 5 Enhance AI-Powered MetaHuman Deployment

Rebeca Moen  Oct 02, 2024 14:20  UTC 06:20


At Unreal Fest 2024, NVIDIA unveiled new Unreal Engine 5 on-device plugins for NVIDIA ACE, designed to streamline the creation and deployment of AI-powered MetaHuman characters on Windows PCs. According to NVIDIA Technical Blog, ACE is a suite of digital human technologies providing speech, intelligence, and animation powered by generative AI.

New Plugins and Features

The latest release includes the Audio2Face-3D plugin for AI-driven facial animations in Autodesk Maya. This plugin offers a simplified interface to expedite avatar development in Maya, complete with source code for further customization in other digital content creation tools.

NVIDIA has also developed an Unreal Engine 5 renderer microservice built on Epic's Unreal Pixel Streaming technology. The microservice supports the NVIDIA ACE Animation Graph microservice, runs on Linux, and is currently in early access. The Animation Graph microservice enables realistic character movements, and with Unreal Pixel Streaming, developers can stream MetaHuman creations to any device.

Enhanced Developer Resources

The NVIDIA ACE Unreal Engine 5 sample project guides developers in integrating ACE into their applications. The project bundles an expanded set of on-device ACE plugins, including:

  • Audio2Face-3D for lip sync and facial animation
  • The Nemotron Mini 4B Instruct model for response generation
  • Retrieval-augmented generation (RAG) for contextual information

These components are optimized for low latency and minimal memory usage on Windows PCs, allowing developers to create responsive MetaHuman characters efficiently.

Streamline 3D Animation with Maya ACE Plugin

Autodesk Maya, known for its high-performance animation capabilities, now supports the Audio2Face-3D plugin, facilitating high-quality, audio-driven facial animations. The plugin’s user interface allows for seamless transition to Unreal Engine 5, and the source code can be tailored for other digital content tools.

To begin, developers should generate an API key or download the Audio2Face-3D microservice. The plugin is part of NVIDIA NIM, a suite of microservices designed to accelerate the deployment of foundation models across clouds and data centers. Developers can access the NVIDIA/Maya-ACE GitHub repository for all necessary resources.
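As a rough sketch, initial setup might look like the following. The clone URL is derived from the NVIDIA/Maya-ACE repository name given above; the `NVIDIA_API_KEY` variable name is an assumption, so consult the repository's README for the exact configuration the plugin expects.

```shell
# Fetch the Maya ACE plugin resources (URL derived from the
# NVIDIA/Maya-ACE repository name; check the repo README for details).
git clone https://github.com/NVIDIA/Maya-ACE.git || echo "clone skipped (no network?)"

# Placeholder key; generate a real one from the NVIDIA API catalog.
# The variable name NVIDIA_API_KEY is an assumption, not confirmed here.
export NVIDIA_API_KEY="nvapi-your-key-here"
```

From there, the plugin's interface inside Maya drives the Audio2Face-3D microservice, whether hosted or running locally.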

Scale Digital Human Technology Deployment with UE5 Pixel Streaming

Deploying digital human technology through the cloud lets it reach a broad audience. The new Unreal Engine 5 renderer microservice in NVIDIA ACE supports the Animation Graph microservice and runs on Linux, facilitating high-fidelity character streaming. This setup allows MetaHuman characters to run on cloud servers and be streamed to browsers and edge devices using Web Real-Time Communication (WebRTC).

The Animation Graph microservice integrates with AI models to create conversational pipelines for characters, maintaining context and history. Its compatibility with UE5 Pixel Streaming ensures efficient deployment of high-quality MetaHuman characters across a range of applications.

Getting Started

Developers interested in the Unreal Engine 5 renderer microservice can apply for early access. More information about NVIDIA ACE and the NIM microservices is available on the NVIDIA Technical Blog.
