AR SDK API Architecture
The NVIDIA® AR SDK enables an application to use the SDK's face tracking, facial landmark tracking, 3D face mesh tracking, 3D body pose tracking, and eye contact features.
- Working with Features
  - Creating an Instance of a Feature Type
  - Getting and Setting Properties for a Feature Type
    - Setting Up the CUDA Stream
    - Summary of AR SDK Accessor Functions
    - Key Values in the Properties of a Feature Type
    - Getting the Value of a Property of a Feature
    - Setting a Property for a Feature
  - Loading a Feature Instance
  - Running a Feature Instance
  - Resetting a Feature Instance
  - Destroying a Feature Instance
- Working with Image Frames on GPU or CPU Buffers
- Properties for the AR SDK Features
  - Face Tracking Property Values
  - Landmark Tracking Property Values
  - Face 3D Mesh Tracking Property Values
  - Eye Contact Property Values
  - Body Detection Property Values
  - 3D Body Pose Keypoint Tracking Property Values
  - Facial Expression Estimation Property Values
  - Video Live Portrait Property Values
  - Frame Selection Property Values
  - Speech Live Portrait Property Values
  - LipSync Property Values
- Using the AR Features
- Using Multiple GPUs
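
The feature lifecycle outlined above (create an instance, set its properties, load, run, destroy) can be sketched roughly as follows. This is a non-compilable pseudocode sketch modeled on the SDK's `NvAR_*` accessor pattern; the placeholder names `featureID`, `stream`, `inputImage`, and `outputBuffer`, and the property-key arguments, are illustrative assumptions — the exact function signatures and key values are covered in the sections listed above.

```
// Hedged sketch of a typical feature lifecycle (illustrative, not a verbatim API listing)
NvAR_FeatureHandle handle;
NvAR_Create(featureID, &handle);                 // create an instance of a feature type

NvAR_SetCudaStream(handle, streamKey, stream);   // set up the CUDA stream
NvAR_SetObject(handle, inputKey, &inputImage,    // set an input property
               sizeof(inputImage));
NvAR_SetObject(handle, outputKey, &outputBuffer, // set an output property
               sizeof(outputBuffer));

NvAR_Load(handle);                               // load the feature instance
NvAR_Run(handle);                                // run it on the current input
NvAR_Destroy(handle);                            // destroy the instance when finished
```

Getter accessors (for example, reading a float or object property back from a feature) follow the same handle-plus-key pattern, as described under "Summary of AR SDK Accessor Functions."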