Overview of the Template Scene Content#

A labeled screenshot of the asset list found in the Template Scene.

1 AnimationGraph_Camera#

This Animation Graph isn’t currently being used. The idea would be to control the camera in much the same way as the character. The camera would be placed based on the location of a joint in the camera rig (Rig_Camera).

2 AnimationGraph#

This Animation Graph controls the character’s movements. The Animation Graph Microservice uses this Graph to generate a pose based on Audio2Face input plus multiple layers of animation clips. This pose is then streamed to the OV Renderer Microservice, overwriting the “ACE_Animation_Target” animation, which controls the visible character. This Graph references the default Skeleton (Rig_Retarget), which needs to be identical to the skeleton all animations are based on.
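
Because the Graph and every clip must agree on the same skeleton, a quick sanity check is to compare joint lists. Below is a minimal sketch using the standard UsdSkel API; the stage path and prim paths are assumptions for illustration and will differ in your scene.

```python
from pxr import Usd, UsdSkel

stage = Usd.Stage.Open("Template_Scene.usda")  # path is an assumption

# Joint order of the default skeleton the Animation Graph references
# (hypothetical prim path).
skel = UsdSkel.Skeleton(stage.GetPrimAtPath("/World/SkelRoots/Rig_Retarget/Skeleton"))
expected = list(skel.GetJointsAttr().Get() or [])

# Every SkelAnimation should use the identical joint list, otherwise the
# streamed poses will not line up with the rig.
for prim in stage.Traverse():
    anim = UsdSkel.Animation(prim)
    if anim:
        joints = list(anim.GetJointsAttr().Get() or [])
        if joints != expected:
            print(f"Joint mismatch: {prim.GetPath()}")
```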

3 Animations#

This is where all SkelAnimations are located. The folder structure is used to automatically generate parts of the Animation Graph: the script “animation_graph_builder_behavior_script.py” creates nodes and assigns posture and gesture animations. (The script only does this when the scene is opened and the previously generated state machines have been deleted.)
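
The real logic lives in that script, but the underlying idea of deriving graph content from the folder layout can be sketched roughly as follows. This is not the script’s actual implementation; the stage path and the “/World/Animations” scope path are assumptions.

```python
from collections import defaultdict
from pxr import Usd, UsdSkel

stage = Usd.Stage.Open("Template_Scene.usda")  # path is an assumption

# Group SkelAnimation prims by their parent scope (Postures, Gestures, ...),
# so a generator can create one state per clip in the matching state machine.
clips_by_folder = defaultdict(list)
animations_root = stage.GetPrimAtPath("/World/Animations")  # hypothetical path
for prim in Usd.PrimRange(animations_root):
    if UsdSkel.Animation(prim):
        clips_by_folder[prim.GetParent().GetName()].append(prim.GetPath())

for folder, clips in clips_by_folder.items():
    print(folder, ":", len(clips), "clips")
```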

4 Gestures#

All animations in this folder will be added to the “GesturesStateMachine” inside the Animation Graph when it gets regenerated. Gestures are animations meant to be triggered temporarily; for example, the character waves and then returns to the previous posture (e.g. Talking).

5 Root_Movements#

These are animations that move the root position of the character, e.g. taking a step in a direction. They are used in a state machine called “PositionStateMachine”, which is not automatically generated. These animations are similar to gestures, but they have a lasting effect.

6 Postures_Subtle#

The template scene comes with two sets of postures: one subtle (Postures_Subtle) and one that’s more cartoonish (Postures). In the template scene the subtle animations have been used, but if the Animation Graph is regenerated the cartoonish set will be used. That’s because the regeneration uses whichever scope is named “Postures”. So, if you’d like to keep using the subtle animations when regenerating the graph, you’ll need to rename that folder to “Postures”. You can do this in the layer called “Animations_Collected.usda”.
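
The rename can be done directly in the stage window, or scripted against that layer. The sketch below uses the Sdf namespace-edit API; the layer path and prim paths are assumptions and depend on how the project is organized.

```python
from pxr import Sdf

# Open the sublayer that holds the posture scopes (path is an assumption).
layer = Sdf.Layer.FindOrOpen("Animations_Collected.usda")

# Move the cartoonish set out of the way, then rename Postures_Subtle to
# Postures so the regenerated graph picks up the subtle set instead.
edit = Sdf.BatchNamespaceEdit()
edit.Add("/World/Animations/Postures", "/World/Animations/Postures_Cartoon")   # hypothetical paths
edit.Add("/World/Animations/Postures_Subtle", "/World/Animations/Postures")
if layer.Apply(edit):
    layer.Save()
```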

7 Facial_Postures#

This is where all the facial postures are stored. Facial postures and gestures work similarly to the regular postures and gestures, but they only include blendshapes and relative motion of the head. They are blended in additively, so these animations have to be prepared accordingly: the rest pose should be subtracted from them, meaning all joints will be located near the zero position so the animations can be used as a relative offset. By default the only “Facial_Posture” is a blinking animation.
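
As a conceptual illustration of that relative offset, the additive clip stores the difference from the rest pose and is added back on top of whatever the base layers produce. A minimal numeric sketch with made-up values (rotations are typically handled as relative rotations rather than raw subtraction):

```python
import numpy as np

# Rest pose and an authored facial pose, e.g. a head-joint translation or a
# row of blendshape weights. The numbers are purely illustrative.
rest_pose = np.array([0.0, 148.2, 3.5])
authored  = np.array([0.0, 148.9, 3.1])

# Preparing the clip: subtract the rest pose so the clip sits near zero.
additive_clip = authored - rest_pose

# At runtime the offset is added on top of the current base pose, so the
# blink (or smile) rides along with whatever posture is active.
base_pose  = np.array([1.2, 149.0, 2.8])
final_pose = base_pose + additive_clip
print(final_pose)
```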

8 Facial_Gestures#

These are gestures of the face (e.g. a smile). They need to be prepared in the way described under “Facial_Postures”.

9 Tests#

These are test animations used as placeholders. They are useful as a debugging tool: if the character is seen in a test pose, that means the Animation Graph got stuck somewhere.

10 Animations_Camera#

These are animation clips for the “AnimationGraph_Camera” which isn’t currently used.

11 Shots#

Shots are the equivalent of a “posture”, but for the camera. (Not currently used)

12 Effects#

Effects are the equivalent of a “gesture”, but for the camera. (Not currently used)

13 SkelRoots#

This prim will be dynamically duplicated for every new stream the Animation Graph Microservice is running. That’s why all Skeletons that are controlled by an Animation Graph are children of this prim.

14 Rig_Retarget#

This is the skeleton used by the Animation Graph, and all animations are based on it. Altering this rig means having to change every animation accordingly, though in most cases this won’t be necessary, as the animations are applied to the visible character using retargeting tags.

15 SkelRoot (Rig_Retarget)#

This SkelRoot needs to have the Animation Graph assigned to it. This is how the character is animated.

16 Meshes (Rig_Retarget)#

These meshes are invisible, since they don’t need to be rendered. They’re only there for debugging purposes, to see the Animation Graph results in an app like USD Composer. Since only this character is directly controlled by the Graph, only this character is a truthful representation of what the animations will look like when running the pipeline.

17 Skeleton (Rig_Retarget)#

By default this Skeleton is a version of the NV-Human rig, and should have plenty of retarget tags for other humanoid characters to be controlled by it. This skeleton is referenced by the Animation Graph.

18 ACE_Animation_Target#

The visible character’s skeleton has to point to this animation. Once deployed, the content of this animation will be overwritten by the poses streaming from the Animation Graph Microservice.
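
In USD, a skeleton points to an animation through the UsdSkel animation-source binding. A minimal sketch of authoring that binding, with hypothetical prim paths:

```python
from pxr import Usd, UsdSkel, Sdf

stage = Usd.Stage.Open("Template_Scene.usda")  # path is an assumption

# The visible character's skeleton (hypothetical path).
skel_prim = stage.GetPrimAtPath("/World/Example/Character/Skeleton")

# Bind it to the animation that the Animation Graph Microservice overwrites
# at runtime (hypothetical path for ACE_Animation_Target).
binding = UsdSkel.BindingAPI.Apply(skel_prim)
binding.CreateAnimationSourceRel().SetTargets(
    [Sdf.Path("/World/SkelRoots/Rig_Retarget/ACE_Animation_Target")]
)
stage.Save()
```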

19 DemoAnimation#

This is merely a test animation for users to visualize how well the retargeting was set up. It serves no purpose after that.

20 Rig_Camera#

This is the skeleton used by the “AnimationGraph_Camera” Animation Graph. It would be duplicated for each stream exactly like the Rig_Retarget Skeleton and control the camera in the same way that rig controls the character, but this isn’t currently used.

21 Scene_Empty#

An example of a background scene and lights. This can be used as is, or be replaced by a custom scene. It has no functionality other than visuals.

22 Example#

This is an example of a visible character. This dummy would be replaced with a custom character, which should be set up in the same way: it has ARKit blendshapes, has retargeting tags, and points to the ACE_Animation_Target animation.
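
As a rough spot check that a custom character carries the ARKit blendshapes, the sketch below scans the character for blendshape bindings using standard UsdSkel queries; the stage and character paths are assumptions, and retargeting tags (authored by the retargeting extension) are not covered here.

```python
from pxr import Usd, UsdSkel

stage = Usd.Stage.Open("Template_Scene.usda")           # path is an assumption
char_root = stage.GetPrimAtPath("/World/MyCharacter")   # hypothetical path

# A few of the 52 standard ARKit blendshape names as a spot check.
expected = {"eyeBlinkLeft", "eyeBlinkRight", "jawOpen", "mouthSmileLeft", "browInnerUp"}

found = set()
for prim in Usd.PrimRange(char_root):
    attr = UsdSkel.BindingAPI(prim).GetBlendShapesAttr()
    if attr and attr.Get():
        found.update(attr.Get())

print("missing ARKit shapes:", expected - found)
```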

23 camera_main#

The camera. In the future this might be controlled by animations, but for now it remains static.