Server Details

Input images to the server node must be of type ImageProto with RGB format. Internally, the image is converted to a different format and stored in a different memory type depending on the platform:

Platform   ImageProto Format   Converted Format   Memory Type
x86_64     RGB                 BGRA               CUDA
L4T        RGB                 NV12               NvMedia

The images can be sent on any or all of the four color[0..3] RX channels and are forwarded on the NVRC video streams in order. For example, if the color1 and color3 channels are connected, color1 images are sent on stream index 0 and color3 images on stream index 1. The lowest connected color channel is treated as the primary color channel: only timestamps from its images are used, and images arriving on the other streams are tagged with the latest timestamp from the primary color channel.
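The mapping and timestamp handling can be pictured with a minimal sketch. The struct and function names below are illustrative assumptions, not part of the NVRC API; only the behavior (streams assigned in channel order, primary-channel timestamps) comes from the description above.

#include <cstdint>
#include <vector>

// Hypothetical sketch of the channel-to-stream mapping and timestamp handling
// described above. These names are illustrative only; they are not NVRC API.
struct ColorChannelImage {
  int channel;      // 0..3: which color[N] RX channel the image arrived on
  int64_t acqtime;  // acquisition timestamp of the image
};

// `connected_channels` lists the connected color channels in ascending order,
// e.g. {1, 3}: color1 maps to stream index 0 and color3 to stream index 1.
int StreamIndexForChannel(const std::vector<int>& connected_channels, int channel) {
  for (int i = 0; i < static_cast<int>(connected_channels.size()); ++i) {
    if (connected_channels[i] == channel) return i;
  }
  return -1;  // channel not connected
}

// The lowest connected channel is the primary one: only its timestamps are
// used, and images on the other streams reuse the latest primary timestamp.
int64_t TimestampForImage(const std::vector<int>& connected_channels,
                          const ColorChannelImage& image,
                          int64_t* latest_primary_timestamp) {
  const int primary_channel = connected_channels.front();
  if (image.channel == primary_channel) {
    *latest_primary_timestamp = image.acqtime;
  }
  return *latest_primary_timestamp;
}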

Images from all sources are required to have the same width and height, which are specified in the nvrcVideoFrame struct and sent using the nvrcServerSendVideo API.

The following resolutions are supported:

  • 640 x 480
  • 1280 x 720
  • 1920 x 1080
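A resolution check can be applied before a frame is handed to NVRC. The sketch below is only an illustration: the helper and constant names are made up, and the exact fields of nvrcVideoFrame and the signature of nvrcServerSendVideo are deliberately left to the NVRC headers rather than guessed here.

#include <cstdint>
#include <utility>

// Supported resolutions listed above (width x height).
constexpr std::pair<int, int> kSupportedResolutions[] = {
    {640, 480}, {1280, 720}, {1920, 1080}};

// Returns true if (width, height) is one of the supported resolutions.
bool IsSupportedResolution(int width, int height) {
  for (const auto& res : kSupportedResolutions) {
    if (res.first == width && res.second == height) return true;
  }
  return false;
}

// A frame passing this check would then be described by an nvrcVideoFrame
// (which carries the width and height) and sent with nvrcServerSendVideo;
// their exact definitions live in the NVRC headers and are not reproduced here.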

The component also accepts additional data inputs, such as detections. When received, this data is added to a JSON document and sent out once per video-frame group. There are two options for when the data is sent out:

  • If the sync_on_detections parameter is true, the latest data is sent out when detections are received, tagged with the timestamp of the video frame the detections were computed for. As a result, detections received by the client should match the corresponding video frame.
  • If the sync_on_detections parameter is false, the latest data is sent out when an image is received. As a result, detections received by the client may be older than the video frame (synchronization is weaker), but latency is lower.
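The timing behavior of these two modes can be summarized in a small sketch. The class and method names are hypothetical; only the sync_on_detections parameter and the send-on-detections versus send-on-image behavior come from the description above.

#include <cstdint>
#include <string>

// Hypothetical sketch of when the JSON data document is sent out.
class DataDocumentSender {
 public:
  explicit DataDocumentSender(bool sync_on_detections)
      : sync_on_detections_(sync_on_detections) {}

  // Called when detections arrive; `acqtime` is the timestamp of the video
  // frame the detections were computed on.
  void OnDetections(const std::string& detections_json, int64_t acqtime) {
    latest_data_ = detections_json;
    if (sync_on_detections_) {
      // Data goes out now, tagged with the frame the detections belong to,
      // so the client can match detections to the right video frame.
      Send(latest_data_, acqtime);
    }
  }

  // Called when a new image (video frame) arrives.
  void OnImage(int64_t acqtime) {
    if (!sync_on_detections_) {
      // Data goes out with every image: lower latency, but the detections
      // may correspond to an older frame.
      Send(latest_data_, acqtime);
    }
  }

 private:
  void Send(const std::string& json, int64_t acqtime) {
    // Transmit `json` over NVRC, tagged with `acqtime`. A document is sent
    // even when it is empty, since the client expects one per video frame.
    // (Transport details omitted in this sketch.)
    (void)json;
    (void)acqtime;
  }

  bool sync_on_detections_;
  std::string latest_data_;
};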

The client receives synchronized frames, including a group of images (up to four, one per stream) and corresponding data tagged with a timestamp matching the image group.

This functionality can be extended to support other data types. Each new data type can be encoded in JSON and added to the data document that is sent. Note that the client application must also be updated to parse the entire received JSON data document and handle the new data types.
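As an illustration of this extension point, the sketch below adds a made-up data type to a per-frame JSON document and parses it on the client side. The keys and the use of nlohmann::json are assumptions for the example; the real document layout is defined by the server component and must match what the client parses.

#include <cstdint>
#include <string>
#include <nlohmann/json.hpp>

// Hypothetical: build the per-frame data document with an extra data type.
nlohmann::json BuildDataDocument(int64_t acqtime) {
  nlohmann::json doc;
  doc["timestamp"] = acqtime;

  // Existing data types (for example, detections) are added here...

  // ...and a new data type can be appended as another JSON entry:
  doc["my_new_data_type"] = {
      {"value", 42},
      {"label", "example"},
  };
  return doc;
}

// Hypothetical client-side parser extended to handle the new entry.
void ParseDataDocument(const nlohmann::json& doc) {
  if (doc.contains("my_new_data_type")) {
    const auto& entry = doc["my_new_data_type"];
    int value = entry["value"];
    std::string label = entry["label"];
    (void)value;
    (void)label;
  }
}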

The client expects to receive a data document with every video frame, so a data document should always be sent, even if it is empty.

The component also receives joystick controls (encoded in JSON) and outputs JoystickStateProto messages, which can be sent to control components in Isaac SDK.
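A rough sketch of that path is shown below. The JSON layout and the intermediate struct are assumptions made for illustration; the actual payload format and the JoystickStateProto schema are defined by the component and Isaac SDK, respectively.

#include <vector>
#include <nlohmann/json.hpp>

// Hypothetical joystick payload received from the client, e.g.:
//   {"axes": [0.1, -0.5], "buttons": [true, false, false]}
struct JoystickState {
  std::vector<double> axes;
  std::vector<bool> buttons;
};

JoystickState ParseJoystickJson(const nlohmann::json& msg) {
  JoystickState state;
  for (const auto& axis : msg.at("axes")) {
    state.axes.push_back(axis.get<double>());
  }
  for (const auto& button : msg.at("buttons")) {
    state.buttons.push_back(button.get<bool>());
  }
  // The component would then copy these values into a JoystickStateProto
  // message and publish it so it can drive control components in Isaac SDK.
  return state;
}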
