Legacy Live Link Interface (Deprecated)
Deprecated since version 2.0.
The ACE Audio2Face plugin allows Audio2Face and Avatar Character Engine (ACE) to stream or burst blendshape coefficients and wave audio into Unreal Engine. The blendshape coefficients are sent to the Live Link interface to drive animation of MetaHuman or other face types. The received wave audio is submitted through the ISubmixBufferListener interface so that playback stays in sync with the facial blendshape animation.
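Audio arrives from the network on one thread while the audio renderer pulls samples on another, so the plugin has to buffer between the two. The sketch below is a minimal, self-contained ring-buffer illustration of that hand-off in plain C++; it is not the plugin's implementation, and the `SampleRing` name and its methods are invented for illustration. In Unreal itself, the consumer side would run inside an ISubmixBufferListener callback.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical single-producer/single-consumer sample buffer illustrating
// how received wave audio can be handed to an audio-render callback.
// NOTE: a real implementation needs thread synchronization; this sketch
// is single-threaded for clarity.
class SampleRing {
public:
    explicit SampleRing(size_t Capacity) : Buffer(Capacity) {}

    // Producer side: the network thread pushes decoded samples.
    size_t Push(const float* Samples, size_t Count) {
        size_t Written = 0;
        while (Written < Count && Size < Buffer.size()) {
            Buffer[(Head + Size) % Buffer.size()] = Samples[Written++];
            ++Size;
        }
        return Written; // samples actually accepted
    }

    // Consumer side: the audio callback pops samples; any shortfall is
    // filled with silence so playback never blocks on the network.
    size_t Pop(float* Out, size_t Count) {
        size_t Read = 0;
        while (Read < Count && Size > 0) {
            Out[Read++] = Buffer[Head];
            Head = (Head + 1) % Buffer.size();
            --Size;
        }
        for (size_t i = Read; i < Count; ++i) Out[i] = 0.0f; // silence
        return Read; // real samples delivered
    }

private:
    std::vector<float> Buffer;
    size_t Head = 0;
    size_t Size = 0;
};
```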
The plugin can operate in two modes when receiving blendshape coefficients: Streaming or Burst.
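The two modes differ in how coefficient frames are delivered: in Streaming mode, frames arrive continuously in real time, while in Burst mode a whole clip's worth of frames arrives at once and must be queued for timed playback. The sketch below illustrates that buffering distinction in plain, self-contained C++; it is not the plugin's implementation, the class and method names are invented, and keeping only the latest frame in Streaming mode is just one plausible policy.

```cpp
#include <deque>
#include <optional>
#include <vector>

// Hypothetical illustration of the two receive modes; not the plugin API.
enum class ReceiveMode { Streaming, Burst };

using BlendshapeFrame = std::vector<float>; // one coefficient per blendshape

class BlendshapeFeed {
public:
    explicit BlendshapeFeed(ReceiveMode InMode) : Mode(InMode) {}

    // Called whenever coefficient data arrives from the network.
    void Receive(const std::vector<BlendshapeFrame>& Frames) {
        if (Mode == ReceiveMode::Streaming) {
            // Real-time delivery: apply the newest frame; stale frames
            // would lag behind the audio, so they are dropped here.
            if (!Frames.empty()) {
                Queue.clear();
                Queue.push_back(Frames.back());
            }
        } else {
            // Burst delivery: the whole clip arrives at once; queue every
            // frame for timed playback at the animation frame rate.
            Queue.insert(Queue.end(), Frames.begin(), Frames.end());
        }
    }

    // Called once per animation tick to fetch the next frame to apply.
    std::optional<BlendshapeFrame> NextFrame() {
        if (Queue.empty()) return std::nullopt;
        BlendshapeFrame Frame = Queue.front();
        Queue.pop_front();
        return Frame;
    }

    size_t Pending() const { return Queue.size(); }

private:
    ReceiveMode Mode;
    std::deque<BlendshapeFrame> Queue;
};
```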
Basic Instructions in Unreal Engine
1. Open the Window > Virtual Production > Live Link tab.
2. Create a new NVIDIA Omniverse LiveLink source.
3. Modify the AnimBlueprint /Game/MetaHumans/Common/Face/Face_AnimBP so that LLink Face Subj is set to Audio2Face. This may not be possible until a connection is made to the plugin with the Audio2Face subject name.
4. Open a map with a MetaHuman actor that includes a LiveLink Skeletal Animation component.
5. Verify that everything is set up so that you can receive blendshape and audio data.