Quick Start Guide

The ‘Quick Start’ is meant to provide an introduction to using platform services and the reference application in a short amount of time with minimal hardware. Refer to AI NVR for a description of extending the ‘Quick Start’ deployment to build a performant, mature AI application in the form of a Network Video Recorder.

Preparation

Before you get started, work through the following to acquire the necessary hardware components and get access to the software.

Required Hardware

  • Jetson Orin AGX devkit or

  • Jetson Orin NX 16GB devkit (self-built) with a 128GB (minimum) NVMe drive

  • Ubuntu 20.04 or 22.04 Desktop/Laptop

  • USB-C flashing cable

  • Monitor, Keyboard, Mouse (for Jetson)

Apply for NGC access through DevZone; set up NGC account

Go to Metropolis Microservices and apply for an NGC account through the DevZone portal. After receiving the invitation email from NGC, follow the instructions to join the ‘release’ team in the ‘moj’ org.

Retrieve NGC API key

Generate and retrieve an NGC API key by following the instructions in the NGC User Guide (see “Generating Your NGC API Key”). The API key is used for downloading Docker images to your device.

Steps are summarized here for convenience:

Log in to www.ngc.nvidia.com. Click your name in the top-right corner > Setup > Generate API Key. Copy the NGC API key that is displayed and save it for future reference.
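If you prefer, you can stash the key in an environment variable on your host for use in later steps (illustrative only; the variable name is our own choice):

# Illustrative: keep the NGC API key handy for the docker login step later
export NGC_API_KEY=<paste-your-NGC-API-key-here>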

Hardware Setup

Connect monitor, keyboard & mouse to Jetson

Connect the monitor to the DP port, or use a DP-to-HDMI dongle to connect over HDMI. Attach the keyboard and mouse to any free USB ports on the device.

Connect Jetson to Host

Connect the host (Ubuntu Desktop/Laptop) to the Jetson devkit USB-C flashing port using the USB cable.

Software Setup

Install BSP R36.2

The first step is to flash the device with BSP (Jetson Linux OS) Release 36.2. This can be done by downloading the image and flashing the device manually using the flashing scripts, or via the SDK Manager graphical interface. Both options are described below.

Manual

Detailed flashing instructions are available in the Jetson Linux Developer Guide. The steps are summarized below for quick reference.

Download BSP image

Download the following two packages from the Jetson Linux repository:

Jetson_Linux_R36.2.0_aarch64.tbz2

Tegra_Linux_Sample-Root-Filesystem_R36.2.0_aarch64.tbz2

Extract the Jetson-Linux BSP and root file-system

tar xf Jetson_Linux_R36.2.0_aarch64.tbz2

sudo tar xpf Tegra_Linux_Sample-Root-Filesystem_R36.2.0_aarch64.tbz2 -C Linux_for_Tegra/rootfs/

Execute the apply_binaries.sh script

cd Linux_for_Tegra/

sudo ./apply_binaries.sh

Install necessary packages on host

sudo ./tools/l4t_flash_prerequisites.sh

Configure username & password for device login

sudo ./tools/l4t_create_default_user.sh -u <username> -p <password> -a

Put device into recovery mode

Orin AGX: Follow the instructions provided on this AGX page.

Orin NX16: Follow the instructions provided on this NX page.

Confirm device in recovery mode

lsusb

Orin AGX: You should see output that contains the string “ID 0955:7023 NVIDIA”

Orin NX16: You should see output that contains the string “ID 0955:7323 NVIDIA”
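As a quick check, you can filter the lsusb output for the NVIDIA recovery-mode device ID (a simple grep; either ID above should match):

# Should print a line containing 0955:7023 (AGX) or 0955:7323 (NX16)
lsusb | grep "ID 0955"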

Flash Image

Orin AGX:

sudo ./flash.sh jetson-agx-orin-devkit internal

Orin NX16:

sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -p "-c ./bootloader/generic/cfg/flash_t234_qspi.xml" -c ./tools/kernel_flash/flash_l4t_t234_nvme.xml --showlogs --network usb0 jetson-orin-nano-devkit nvme0n1p1

SDK Manager

Install and Launch SDKM

Download the latest version from SDK Manager and install it on your host (Ubuntu Desktop/Laptop). Then launch it with the command:

sdkmanager

Install BSP

Follow the instructions to install the BSP by selecting “JetPack 6.0 DP” in Step 01 and “Jetson Linux” in Step 02. The other components in Step 02 can be deselected, as they are not needed.

Install Platform Services

Install Platform Services package

On your Jetson device, install the latest platform services Debian package via apt from the Jetson apt repository. This installs all platform services, including any dependencies.

sudo apt update

sudo apt install nvidia-jetson-services
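To confirm the package was installed, you can query apt (a quick sanity check):

# Lists the installed platform services package and its version
apt list --installed 2>/dev/null | grep nvidia-jetson-services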

Update settings for performance

If you would like to get the best performance from your device, you can optionally set it to maximum power and clock speed as follows:

sudo nvpmodel -m 0

(A reboot is required after changing the power mode.)

sudo jetson_clocks
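You can verify the active power mode afterwards (nvpmodel’s query flag; mode 0 corresponds to the maximum power setting):

# Prints the currently selected power mode
sudo nvpmodel -q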

If you would like to maximize the number of video streams processed by the system, especially for the AI NVR application, run the following commands. These tweak kernel parameters that set the sizes of receive buffers used by network sockets receiving video data over RTSP.

sudo sysctl -w net.core.rmem_default=2129920

sudo sysctl -w net.core.rmem_max=10000000
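Note that sysctl -w changes do not persist across reboots. If you want the buffer sizes to survive a reboot, one option is to append them to /etc/sysctl.conf (a minimal sketch; adapt to your system’s conventions):

# Persist the receive-buffer settings across reboots
echo "net.core.rmem_default=2129920" | sudo tee -a /etc/sysctl.conf
echo "net.core.rmem_max=10000000" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p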

Install Application Bundle

Download and Extract bundle

Download the application bundle from NGC to your Jetson using this NGC link. You will need to enter your NGC credentials. On the page, use one of the options available in the Download menu (top right).

Extract the files on your Jetson using the following commands:

unzip files.zip [if you used the direct download option]

tar -xvf ai_nvr.tar.gz

Copy config files

sudo cp ai_nvr/config/ai-nvr-nginx.conf /opt/nvidia/jetson/services/ingress/config/

Set VST video storage

Set the value of the parameter “total_video_storage_size_MB” in the file ai_nvr/config/vst/vst_storage.json to a suitable size based on the available space on your drive (you can use the command df -h to check how much space is available in your root file system). For example, use a value of 10000 to set a 10GB limit. For details about VST configuration, refer to the Storage Config.
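For example, you can check free space and script the edit with jq (illustrative; assumes jq is installed and that total_video_storage_size_MB is a top-level key in the file):

# Check free space, then set a 10 GB (10000 MB) storage limit
df -h /
jq '.total_video_storage_size_MB = 10000' ai_nvr/config/vst/vst_storage.json > /tmp/vst_storage.json
sudo mv /tmp/vst_storage.json ai_nvr/config/vst/vst_storage.json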

Install NVStreamer App

NVStreamer is NVIDIA developer software that enables storing and serving video files that can be streamed over the RTSP protocol. It can serve as an alternative to cameras for creating video sources as inputs to the Video Storage Toolkit (VST) microservice. Set up NVStreamer on your host (Ubuntu Desktop/Laptop) using the instructions in NVStreamer on Jetson Orin.

Download a sample video file to be streamed with NVStreamer from NGC using this link. You will need to enter your NGC credentials. On the page, use one of the options available in the Download menu (top right).

Follow instructions in the Uploading Videos to NVStreamer section to upload the sample video into NVStreamer.

Running Hello World

Run IVA Application

To run a sample Intelligent Video Analytics (IVA) application, follow the steps below.

Start Services

Log in to nvcr.io using your NGC API key. This is needed to access the service and application containers hosted on NGC.

sudo docker login nvcr.io -u '$oauthtoken' -p <NGC-API-KEY>

For this sample, we only need the Redis & Ingress services, which can be started with:

sudo systemctl start jetson-redis

sudo systemctl start jetson-ingress

Note

The first launch may take a little while to complete since the container images need to be downloaded to the device from NGC.
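You can confirm both services are active before proceeding:

# Each unit should report "active"
systemctl is-active jetson-redis jetson-ingress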

The services may be stopped using the following commands (or by rebooting):

sudo systemctl stop jetson-redis

sudo systemctl stop jetson-ingress

Start Application

Launch the application from the downloaded bundle. Note that the docker compose launch command depends on the device it is running on:

cd ai_nvr

If on Orin AGX: sudo docker compose -f compose_agx.yaml up -d --force-recreate

If on Orin NX16: sudo docker compose -f compose_nx.yaml up -d --force-recreate

Ensure that the containers are running as expected by running the sudo docker ps command. Sample output is shown below:

CONTAINER ID   IMAGE                                                        COMMAND                  CREATED          STATUS              PORTS     NAMES
95d55ac2dbf1   nvcr.io/e7ep4mig3lne/release/sdr:mmj_v1                      "sh -c '/wdm/dist/sd…"   2 minutes ago    Up About a minute             sdr-emdx
652f5e29c412   nvcr.io/e7ep4mig3lne/release/deepstream:mmj_v1               "/opt/nvidia/nvidia_…"   2 minutes ago    Up About a minute             deepstream
61f0d9e90e0c   nvcr.io/e7ep4mig3lne/release/vst:v1.2.37_aarch64             "sh -c '/root/vst_re…"   2 minutes ago    Up About a minute             vst
f6aeab627b70   nvcr.io/e7ep4mig3lne/release/emdx-analytics:mmj_v1           "sh -c 'gunicorn --w…"   2 minutes ago    Up About a minute             emdx-analytics-02
37282da57a44   nvcr.io/e7ep4mig3lne/release/sdr:mmj_v1                      "sh -c '/wdm/dist/sd…"   2 minutes ago    Up About a minute             sdr
1c6023111e53   nvcr.io/e7ep4mig3lne/release/emdx-analytics:mmj_v1           "sh -c 'gunicorn --w…"   2 minutes ago    Up About a minute             emdx-analytics-01
535e9aecc104   nvcr.io/e7ep4mig3lne/release/emdx-analytics-web-api:mmj_v1   "sh -c 'gunicorn --w…"   2 minutes ago    Up 2 minutes                  emdx-webapi
3cac3e230392   nvcr.io/e7ep4mig3lne/release/ialpha-ingress-arm64v8:0.8      "sh -c '/nginx.sh 2>…"   36 minutes ago   Up 36 minutes                 ingress
39121525647d   redisfab/redistimeseries:master-arm64v8-jammy                "docker-entrypoint.s…"   38 minutes ago   Up 38 minutes                 redis
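For a more compact health check, the same information can be formatted with standard docker ps options:

# Prints one "name: status" line per running container
sudo docker ps --format '{{.Names}}: {{.Status}}'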

The application may be stopped using the following commands:

cd ai_nvr

If on Orin AGX: sudo docker compose -f compose_agx.yaml down --remove-orphans

If on Orin NX16: sudo docker compose -f compose_nx.yaml down --remove-orphans

Add NVStreamer RTSP stream to VST

Follow the instructions in the NVStreamer documentation to add the stream to VST. See the Adding RTSP Stream to VST section in the Overview for details.

Ensure that the stream can be viewed from the VST Reference Web App through the Live Streams tab. See Reference Web App for more details.

Also make sure that these streams were correctly added to DeepStream and are being processed. To do so, view the DeepStream logs and confirm that the added streams are visible there with an FPS value greater than 0. Ideally this will be near 30 FPS, but it may be lower depending on the input video FPS. See Logs for more details.
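For example, assuming the DeepStream container is named deepstream as in the sample docker ps output above, you can follow its logs and filter for the FPS reports:

# Follow the DeepStream container logs and watch the per-stream FPS values
sudo docker logs -f deepstream 2>&1 | grep -i fps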

View video overlay & analytics

Processed video output can be viewed as an RTSP stream accessible at rtsp://<JETSON-DEVICE-IP>:8555/ds-test or rtsp://<JETSON-DEVICE-IP>:8556/ds-test, depending on the pipeline the input stream was placed in. Use a media player such as VLC to open and view the stream. Alternatively, the stream can be added to VST as a new stream so it can be viewed via the VST web UI. If adding to VST, ensure that the name includes the word “overlay”; otherwise SDR will not ignore this stream and will add it back to DeepStream, causing a circular dependency.
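For example, the overlay stream can be opened from your host with ffplay (part of the ffmpeg package) if you prefer a command line over the VLC GUI:

# Open the overlay output of the first pipeline; substitute your device IP
ffplay rtsp://<JETSON-DEVICE-IP>:8555/ds-test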

DeepStream is used to create an output stream showing the live input streams added via VST, with bounding boxes drawn around people. If gems (Region of Interest and Tripwire) are set, up to one of each type will also be shown on each overlay tile. As streams are added to and removed from VST, the same is reflected in the overlay stream. By default, up to 6 streams (3 on each pipeline) are supported on Orin NX16 and up to 16 (8 on each pipeline) on Orin AGX.

See DeepStream for more information on the overlay stream and DeepStream.

A sample screenshot from the overlay stream can be seen below:

[Image: sample overlay stream screenshot (moj_overlay_screenshot.png)]

Supported Stream Count

The number of streams (cameras) supported by a Metropolis Microservices based application depends on a variety of factors, including:

  • Hardware platform used (Orin AGX versus NX)

  • Model used: PeopleNet 2.6 is an unpruned model offering superior accuracy at the expense of higher resource utilization. Model architecture and the use of pruning and quantization techniques impact utilization and hence the supported stream count

  • DeepStream configuration: configuration attributes impacting resource utilization include the inference interval, the type of tracker used, and the streammux/inference/tracker intervals

  • Stream resolution and encoding: these determine decoder limits, as H.265 supports a larger decode limit than H.264 (26 versus 14 streams at 1080p resolution)

  • Number of concurrent WebRTC streams: WebRTC streaming consumes memory, and on devices with less available RAM the memory consumed at higher stream counts can become a bottleneck

Ultimately, processing more streams than the system supports saturates one or more system resources, including GPU, DLA, RAM, memory bandwidth, and decoder/encoder utilization. Inputting more streams than the system can support causes the FPS of the underlying DeepStream pipeline to drop below real-time performance (typically 30 FPS), so the system drops frames. Utilization of these resources can be identified using the tegrastats utility.
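For example, a basic invocation samples utilization once per second (tegrastats supports a millisecond interval option):

# Reports CPU, GPU, RAM, and codec engine utilization every 1000 ms
sudo tegrastats --interval 1000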

The AI-NVR reference application showcased in this quick start guide supports the stream counts below, based on the PeopleNet 2.6 model with DLA-based inference and the NVDCF tracker in its accuracy configuration running on PVA:

  • 16 H.265 streams on Orin AGX developer kit (with 64GB RAM)

  • 6 H.265 streams on Orin NX16

These numbers are based on inference running on DLA, and the tracker on a combination of PVA and GPU, thereby sparing the GPU for running additional AI.

In addition to the above, we were able to launch WebRTC streaming for 2 streams on both Orin AGX and Orin NX with good streaming quality.

Note that one instance of the hardware decoder is used by each of the following:

  • DeepStream processing each video stream from VST

  • Every WebRTC stream instance launched between the mobile app (MMJ client) and the MMJ device

The sum of the above has to stay within the supported decoder stream count; developers can apportion the number between the two usages as needed. For example, 16 DeepStream streams plus 2 WebRTC sessions on Orin AGX use 18 decoder instances, which is within the 26-stream H.265 limit at 1080p.

The utilization metrics for various system resources on Orin AGX and Orin NX16 are shown in the table below.

Utilization Metrics

Platform         CPU    RAM         GPU    VIC    DLA0/1 (avg)   PVA
Orin AGX 64GB    53%    31,158 MB   42%    87%    78%            34%
Orin NX16        61%    8,325 MB    28%    55%    33%            26%

Note the remaining available GPU capacity, which enables simultaneous execution of other AI workloads.

Use the tegrastats utility to monitor whether any system resources are approaching full utilization in your scenarios, which would lead to a drop in FPS and streaming quality.