Unreal Renderer Microservice#

The Unreal renderer microservice wraps the Unreal Engine real-time renderer. Its purpose is to render a single avatar and its scene in real time. The current pose of the avatar, along with audio, is provided over a gRPC endpoint; the rendered frames are then streamed out together with the audio over WebRTC.

More information about the Unreal Engine can be found in Epic’s Unreal Engine documentation.

Figure: Avatar scene rendered with the Unreal Engine real-time renderer.

Microservice Information#

Service Name: ucf.svc.ia-unreal-renderer-microservice:0.1.2

Publishing Location: nvidia/ace

Previous Versions:

License: NVAIE

NGC Enablement: Public

Unreal Engine#

The microservice wraps the Unreal Engine, which is used to render the avatar scene.

Streaming#

The rendered frames are streamed out of the microservice using Pixel Streaming, Unreal Engine's WebRTC protocol implementation.

Animation Data#

The primary function of the Unreal rendering microservice is the real-time rendering of an avatar pose. This pose is provided through a gRPC interface in the form of Animation Data. The avatar scene itself is provided as a static asset. The renderer combines the avatar asset with the real-time pose data and renders each avatar frame in real time.
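The shape of a single pose update can be sketched as below. This is an illustrative model only: the type and field names (`AnimationDataFrame`, `timestamp_ms`, `blendshape_weights`, `audio_chunk`) are assumptions for the example, not the actual gRPC message definition, which is specified by the Animation Data interface.

```python
from dataclasses import dataclass, field

@dataclass
class AnimationDataFrame:
    """Illustrative pose frame; the real message layout is defined by the gRPC API."""
    timestamp_ms: int  # presentation time of this pose relative to stream start
    # Facial/body pose expressed as named blendshape weights in [0.0, 1.0].
    blendshape_weights: dict = field(default_factory=dict)
    # Audio samples aligned with this pose, so video and audio stay in sync.
    audio_chunk: bytes = b""

# A pose update for the second frame of a 30 fps stream (frame interval ~33 ms).
frame = AnimationDataFrame(
    timestamp_ms=33,
    blendshape_weights={"jawOpen": 0.4, "eyeBlinkLeft": 1.0},
)
print(frame.blendshape_weights["jawOpen"])  # → 0.4
```

A client would send a steady stream of such frames over the gRPC endpoint; the renderer applies each pose to the avatar asset and emits the corresponding rendered frame.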

Avatar Customization#

This rendering microservice is a versatile tool for real-time rendering, supporting a range of use cases and customization options. Scenes can be created and customized in two ways: avatars can be generated and tailored with MetaHuman Creator, or, for more advanced users, a custom Unreal project can be created manually.

Typical Use Case#

Typically, the Animation Graph microservice serves as the source of the Animation Data.