API Specification

Ingress

gRPC API

The Omniverse Renderer microservice accepts an input animation data stream from an animation data compositor (e.g. the Animation Graph microservice). The renderer requests this animation data stream through the PullAnimationDataStream remote procedure call (RPC), then updates the avatar pose, renders the image, and streams the image over RTP to a downstream component. The RPC is defined in the Animation Data Service.
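
As an illustration, a renderer-side client for this RPC might look like the Python sketch below. The module, stub, and message names (animation_data_pb2, AnimationDataServiceStub, PullAnimationDataStreamRequest) and the stream_id field are placeholders: the actual definitions come from the Animation Data Service proto and may differ, and the sketch assumes a server-streaming RPC.

import grpc

# Hypothetical stubs generated from the Animation Data Service proto;
# the real module, stub, and message names may differ.
import animation_data_pb2 as ad_pb2
import animation_data_pb2_grpc as ad_pb2_grpc


def pull_animation_data(compositor_address: str, stream_id: str) -> None:
    """Open the animation data stream and consume it message by message."""
    with grpc.insecure_channel(compositor_address) as channel:
        stub = ad_pb2_grpc.AnimationDataServiceStub(channel)
        request = ad_pb2.PullAnimationDataStreamRequest(stream_id=stream_id)
        # Assuming a server-streaming RPC: iterate over animation data
        # messages until the compositor closes the stream.
        for animation_data in stub.PullAnimationDataStream(request):
            handle_animation_data(animation_data)


def handle_animation_data(message) -> None:
    # Placeholder for the renderer's pose update and rendering step.
    print("received animation data message")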

Animation Data Stream Audio Format

The Omniverse Renderer microservice currently only supports animation data streams with the following audio format.

Format            PCM
Channel Count     1
Sample Rate       16000 Hz
Bits Per Sample   16
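
For reference, the following Python sketch checks a local WAV file against these constraints before its audio is used in an animation data stream. It uses only the standard library; the file name speech.wav is an example.

import wave

# Audio format supported by the Omniverse Renderer microservice.
REQUIRED_CHANNELS = 1          # mono
REQUIRED_SAMPLE_RATE = 16000   # Hz
REQUIRED_SAMPLE_WIDTH = 2      # bytes, i.e. 16 bits per sample (PCM)


def is_supported_audio(path: str) -> bool:
    """Return True if the WAV file matches the supported PCM format."""
    with wave.open(path, "rb") as wav:
        return (
            wav.getnchannels() == REQUIRED_CHANNELS
            and wav.getframerate() == REQUIRED_SAMPLE_RATE
            and wav.getsampwidth() == REQUIRED_SAMPLE_WIDTH
        )


if __name__ == "__main__":
    print(is_supported_audio("speech.wav"))  # example file name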

HTTP API

In addition to the animation data stream APIs, the Omniverse Renderer microservice also exposes an HTTP API. With this API you can:

  • Add streams to and remove streams from the microservice.

You can find more information in the HTTP API documentation.

Once the microservice has been started, an interactive HTTP API overview is available at http://localhost:8021/docs.

Stream Management

The Omniverse Renderer microservice is a stateful service: it only starts streaming after a stream has been registered with a stream ID.

An animation data stream can be added with:

stream_id=$(uuidgen)
curl -X POST -s http://localhost:8021/streams/$stream_id

Similarly, an animation data stream can be removed with:

curl -X DELETE -s http://localhost:8021/streams/$stream_id

Please note that the Omniverse Renderer microservice only supports a single stream.
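
The same add and remove calls can also be issued from Python. The sketch below mirrors the curl commands above using the third-party requests package; the port 8021 and the /streams/{stream_id} endpoint are taken from the examples above.

import uuid

import requests  # third-party package: pip install requests

BASE_URL = "http://localhost:8021"


def add_stream() -> str:
    """Register a new stream and return its ID."""
    stream_id = str(uuid.uuid4())
    response = requests.post(f"{BASE_URL}/streams/{stream_id}")
    response.raise_for_status()
    return stream_id


def remove_stream(stream_id: str) -> None:
    """Remove a previously registered stream."""
    response = requests.delete(f"{BASE_URL}/streams/{stream_id}")
    response.raise_for_status()


if __name__ == "__main__":
    sid = add_stream()
    print(f"registered stream {sid}")
    remove_stream(sid)
    print("stream removed")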

Egress

RTP over UDP Streaming

When a stream is added to the renderer, the renderer first negotiates the audio and video ports for RTP over UDP streaming with the downstream client (e.g. VST) through the CreateUDPConnection RPC. The CreateUDPConnectionRequest message contains the information required to initiate the UDP port negotiation with the downstream microservice. The CreateUDPConnectionReply response message from the downstream service contains the negotiated audio port, video port, and the IP address of the downstream microservice.

// Copyright (c) 2022-2023 NVIDIA Corporation.  All rights reserved.
//
// NVIDIA Corporation and its licensors retain all intellectual property
// and proprietary rights in and to this software, related documentation
// and any modifications thereto.  Any use, reproduction, disclosure or
// distribution of this software and related documentation without an express
// license agreement from NVIDIA Corporation is strictly prohibited.

syntax = "proto3";

package vstserver;

service VstGrpcServer {
  rpc CreateUDPConnection (CreateUDPConnectionRequest) returns (CreateUDPConnectionReply) {}
}

message CreateUDPConnectionRequest {
  message VideoParams  {
    string codec = 1;
    int32 framerate = 2;
  }

  message AudioParams  {
    string codec = 1;
    int32 sample_rate_Hz = 2;
    int32 bits_per_sample = 3;
  }

  message ClientUdpPorts {
   int32 video_port = 1;
   int32 audio_port = 2;
  }
  VideoParams video_params = 1;
  AudioParams audio_params = 2;
  string connection_id = 3;
  ClientUdpPorts client_ports = 4;
}

message CreateUDPConnectionReply {
  int32 video_port = 1;
  int32 audio_port = 2;
  string host_address = 3;
}

Once the renderer receives the audio/video ports and IP address from the downstream microservice, it configures the RTP streaming component with the negotiated ports and starts streaming the rendered scene with RTP over UDP to the specified host address.
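
To make this flow concrete, the sketch below shows how the renderer side might issue the CreateUDPConnection call using Python stubs generated from the proto above (e.g. with grpcio-tools). The module names vstserver_pb2/vstserver_pb2_grpc depend on the proto file name, and the codec, framerate, address, and port values are example placeholders rather than values mandated by the service.

import grpc

# Hypothetical modules generated from the vstserver proto shown above, e.g.:
#   python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. vstserver.proto
import vstserver_pb2
import vstserver_pb2_grpc


def negotiate_udp_ports(vst_address: str, connection_id: str):
    """Negotiate the RTP over UDP audio/video ports with the downstream client."""
    with grpc.insecure_channel(vst_address) as channel:
        stub = vstserver_pb2_grpc.VstGrpcServerStub(channel)
        request = vstserver_pb2.CreateUDPConnectionRequest(
            video_params=vstserver_pb2.CreateUDPConnectionRequest.VideoParams(
                codec="h264",       # example value
                framerate=30,       # example value
            ),
            audio_params=vstserver_pb2.CreateUDPConnectionRequest.AudioParams(
                codec="pcm",        # example value
                sample_rate_Hz=16000,
                bits_per_sample=16,
            ),
            connection_id=connection_id,
            client_ports=vstserver_pb2.CreateUDPConnectionRequest.ClientUdpPorts(
                video_port=30000,   # example value
                audio_port=30002,   # example value
            ),
        )
        reply = stub.CreateUDPConnection(request)
        # The reply carries the negotiated ports and the downstream address
        # that the RTP streaming component should target.
        return reply.host_address, reply.video_port, reply.audio_port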