Quickstart Guide
======================

|NVIDIA®| DeepStream Software Development Kit (SDK) is an accelerated AI framework to build intelligent video analytics (IVA) pipelines. DeepStream runs on |NVIDIA®| T4, |NVIDIA®| Ampere, and platforms such as |NVIDIA®| Jetson™ Nano, |NVIDIA®| Jetson AGX Xavier™, |NVIDIA®| Jetson Xavier NX™, |NVIDIA®| Jetson™ TX1 and TX2.

.. |NVIDIA®| replace:: NVIDIA\ :sup:`®`

Jetson Setup
--------------

This section explains how to prepare a Jetson device before installing the DeepStream SDK.

Install Jetson SDK components
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Download NVIDIA SDK Manager from https://developer.nvidia.com/embedded/jetpack. You will use it to install JetPack 4.5.1 GA (corresponding to the L4T 32.5.1 release).

- NVIDIA SDK Manager is a graphical application that flashes and installs the JetPack packages.
- The flashing procedure takes approximately 10-30 minutes, depending on the host system.

.. note::

    If you are using the Jetson Nano or Jetson Xavier NX developer kit, you can download the SD card image from https://developer.nvidia.com/embedded/jetpack. It comes packaged with CUDA, TensorRT, and cuDNN.

Install Dependencies
~~~~~~~~~~~~~~~~~~~~~~~~

Enter the following commands to install the prerequisite packages:

::

    $ sudo apt install \
        libssl1.0.0 \
        libgstreamer1.0-0 \
        gstreamer1.0-tools \
        gstreamer1.0-plugins-good \
        gstreamer1.0-plugins-bad \
        gstreamer1.0-plugins-ugly \
        gstreamer1.0-libav \
        libgstrtspserver-1.0-0 \
        libjansson4=2.11-1

Install librdkafka (to enable Kafka protocol adaptor for message broker)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

1. Clone the `librdkafka` repository from GitHub:

   ::

    $ git clone https://github.com/edenhill/librdkafka.git

2. Configure and build the library:

   ::

    $ cd librdkafka
    $ git reset --hard 7101c2310341ab3f4675fc565f64f0967e135a6a
    $ ./configure
    $ make
    $ sudo make install

3. Copy the generated libraries to the deepstream directory:

   ::

    $ sudo mkdir -p /opt/nvidia/deepstream/deepstream-5.1/lib
    $ sudo cp /usr/local/lib/librdkafka* /opt/nvidia/deepstream/deepstream-5.1/lib

Install the DeepStream SDK
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- **Method 1**: Using SDK Manager

  Select DeepStreamSDK from the ``Additional SDKs`` section along with the JP 4.5.1 software components for installation.

- **Method 2**: Using the DeepStream tar package

  1. Download the DeepStream 5.1 Jetson tar package ``deepstream_sdk_v5.1.0_jetson.tbz2`` to the Jetson device.
  2. Enter the following commands to extract and install the DeepStream SDK:

     ::

        $ sudo tar -xvf deepstream_sdk_v5.1.0_jetson.tbz2 -C /
        $ cd /opt/nvidia/deepstream/deepstream-5.1
        $ sudo ./install.sh
        $ sudo ldconfig

- **Method 3**: Using the DeepStream Debian package

  Download the DeepStream 5.1 Jetson Debian package ``deepstream-5.1_5.1.0-1_arm64.deb`` to the Jetson device, then enter the command:

  ::

    $ sudo apt-get install ./deepstream-5.1_5.1.0-1_arm64.deb

  .. note::

    If you install the DeepStream SDK Debian package using the ``dpkg`` command, you must install the following packages before installing the Debian package:

    * ``libgstrtspserver-1.0-0``
    * ``libgstreamer-plugins-base1.0-dev``

- **Method 4**: Using the `apt-server`

  1. Open the apt source configuration file in a text editor, using a command similar to:

     ``$ sudo vi /etc/apt/sources.list.d/nvidia-l4t-apt-source.list``

  2. Change the repository name and download URL in the deb commands shown below:

     ``deb https://repo.download.nvidia.com/jetson/common r32.5 main``

  3. Save and close the source configuration file.
  4. Enter the commands:

     ::

        $ sudo apt update
        $ sudo apt install deepstream-5.1

- **Method 5**: Using a Docker container

  DeepStream Docker containers are available on NGC. See the :doc:`DS_docker_containers` section to learn about developing and deploying DeepStream using Docker containers.
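After installing with any of the methods above, you can optionally sanity-check the setup before running the reference application. The commands below are a suggested check, not an SDK requirement; they assume ``deepstream-app`` ended up on the ``PATH`` and that the DeepStream GStreamer plugins (for example ``nvinfer``) were registered by the installer:

::

    # Print the DeepStream version reported by the reference application
    $ deepstream-app --version

    # Verify that a core DeepStream GStreamer plugin is discoverable
    $ gst-inspect-1.0 nvinfer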
Run deepstream-app (the reference application)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

1. Navigate to the samples directory on the development kit.
2. Enter the following command to run the reference application:

   ::

    $ deepstream-app -c <path_to_config_file>

   Where ``<path_to_config_file>`` is the pathname of one of the reference application's configuration files, found in ``configs/deepstream-app/``. See Package Contents for a list of the available files. An illustrative invocation appears after this list.

   .. note::

    * You can find sample configuration files under the ``/opt/nvidia/deepstream/deepstream-5.1/samples`` directory. Enter this command to see application usage:

      ::

        $ deepstream-app --help

    * To save the TensorRT Engine/Plan file, run the following command:

      ::

        $ sudo deepstream-app -c <path_to_config_file>

3. To show labels in the 2D tiled display view, expand the source of interest with a mouse `left-click` on the source. To return to the tiled display, `right-click` anywhere in the window.
4. Keyboard selection of a source is also supported. On the console where the application is running, press the ``z`` key followed by the desired row index (0 to 9), then the column index (0 to 9), to expand the source. To restore the 2D tiled display view, press ``z`` again.
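As a concrete illustration of step 2, assuming the default installation path and one of the sample configurations shipped under ``samples/configs/deepstream-app`` (the same file referenced later in this guide), the invocation might look like:

::

    # Illustrative run of the reference application with a shipped sample config
    # (paths assume the default DeepStream 5.1 install location)
    $ cd /opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app
    $ deepstream-app -c source30_1080p_dec_infer-resnet_tiled_display_int8.txt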
Boost the clocks
^^^^^^^^^^^^^^^^^

After you have installed the DeepStream SDK, run these commands on the Jetson device to boost the clocks:

::

    $ sudo nvpmodel -m 0
    $ sudo jetson_clocks

Run precompiled sample applications
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

1. Navigate to the chosen application directory inside ``sources/apps/sample_apps``.
2. Follow the directory's README file to run the application.

.. note::

    If the application encounters errors and cannot create Gst elements, remove the GStreamer cache, then try again. To remove the GStreamer cache, enter this command:

    ``$ rm ${HOME}/.cache/gstreamer-1.0/registry.aarch64.bin``

    When the application is run with a model that does not have an existing engine file, it may take up to a few minutes (depending on the platform and the model) to generate the file and launch the application. For later runs, the generated engine files can be reused for faster loading.

dGPU Setup for Ubuntu
-------------------------

This section explains how to prepare an ``Ubuntu x86_64`` system with NVIDIA dGPU devices before installing the DeepStream SDK.

.. note::

    This document uses the term dGPU ("discrete GPU") to refer to NVIDIA GPU expansion card products such as the NVIDIA |Tesla®| T4 and P4, NVIDIA |GeForce®| GTX 1080, and NVIDIA |GeForce®| RTX 2080. This version of the DeepStream SDK runs on specific dGPU products on x86_64 platforms supported by NVIDIA driver 460.32 and NVIDIA TensorRT™ 7.2 and later versions.

.. |Tesla®| replace:: Tesla\ :sup:`®`
.. |GeForce®| replace:: GeForce\ :sup:`®`

You must install the following components:

- Ubuntu 18.04
- GStreamer 1.14.1
- NVIDIA driver 460.32
- CUDA 11.1
- TensorRT 7.2.X

Remove all previous DeepStream installations
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

To remove all DeepStream 3.0 or earlier installations, enter the following commands:

::

    $ sudo rm -rf /usr/local/deepstream /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstnv* /usr/bin/deepstream* /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libnvdsgst* /usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream* /opt/nvidia/deepstream/deepstream*
    $ sudo rm -rf /usr/lib/x86_64-linux-gnu/libv4l/plugins/libcuvidv4l2_plugin.so

To remove DeepStream 4.0 or later installations:

1. Open the ``uninstall.sh`` file in ``/opt/nvidia/deepstream/deepstream/``.
2. Set ``PREV_DS_VER`` to ``4.0``.
3. Run the following script as ``sudo``: ``./uninstall.sh``

Install Dependencies
~~~~~~~~~~~~~~~~~~~~~~

Enter the following commands to install the necessary packages before installing the DeepStream SDK:

::

    $ sudo apt install \
        libssl1.0.0 \
        libgstreamer1.0-0 \
        gstreamer1.0-tools \
        gstreamer1.0-plugins-good \
        gstreamer1.0-plugins-bad \
        gstreamer1.0-plugins-ugly \
        gstreamer1.0-libav \
        libgstrtspserver-1.0-0 \
        libjansson4

Install NVIDIA driver 460.32
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

- Download NVIDIA driver 460.32 from the NVIDIA Unix drivers page at: https://www.nvidia.com/download/driverResults.aspx/168347/en-us
- Run the following commands to install it:

  ::

    $ chmod 755 NVIDIA-Linux-x86_64-460.32.03.run
    $ sudo ./NVIDIA-Linux-x86_64-460.32.03.run

Install CUDA Toolkit 11.1
^^^^^^^^^^^^^^^^^^^^^^^^^^^

Download and install CUDA Toolkit 11.1 from the NVIDIA Developer Center at: https://developer.nvidia.com/cuda-downloads

Install TensorRT 7.2.X
^^^^^^^^^^^^^^^^^^^^^^^

Download and install TensorRT 7.2.X from the NVIDIA Developer Center: https://developer.nvidia.com/nvidia-tensorrt-download
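Before continuing, it can be helpful to confirm that the driver, CUDA, and TensorRT versions listed above are actually in place. The checks below are optional suggestions rather than part of the official procedure; the ``/usr/local/cuda`` symlink and Debian-style TensorRT packages are assumptions that depend on how you installed the toolkits:

::

    # Driver: nvidia-smi should report driver version 460.32.xx
    $ nvidia-smi

    # CUDA: nvcc should report release 11.1 (assumes the default /usr/local/cuda symlink)
    $ /usr/local/cuda/bin/nvcc --version

    # TensorRT: the libnvinfer packages should be at 7.2.x (Debian package install assumed)
    $ dpkg -l | grep nvinfer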
Install librdkafka (to enable Kafka protocol adaptor for message broker)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

1. Clone the `librdkafka` repository from GitHub:

   ::

    $ git clone https://github.com/edenhill/librdkafka.git

2. Configure and build the library:

   ::

    $ cd librdkafka
    $ git reset --hard 7101c2310341ab3f4675fc565f64f0967e135a6a
    $ ./configure
    $ make
    $ sudo make install

3. Copy the generated libraries to the deepstream directory:

   ::

    $ sudo mkdir -p /opt/nvidia/deepstream/deepstream-5.1/lib
    $ sudo cp /usr/local/lib/librdkafka* /opt/nvidia/deepstream/deepstream-5.1/lib

Install the DeepStream SDK
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- **Method 1**: Using the DeepStream Debian package

  Download the DeepStream 5.1 dGPU Debian package ``deepstream-5.1_5.1.0-1_amd64.deb`` from https://developer.nvidia.com/deepstream-51-510-1-amd64deb and enter the command:

  ::

    $ sudo apt-get install ./deepstream-5.1_5.1.0-1_amd64.deb

- **Method 2**: Using the DeepStream tar package

  Download the tar package from https://developer.nvidia.com/deepstream-sdk-v510-x86-64tbz2, then navigate to the location of the downloaded package to extract and install the DeepStream SDK:

  ::

    $ sudo tar -xvf deepstream_sdk_v5.1.0_x86_64.tbz2 -C /
    $ cd /opt/nvidia/deepstream/deepstream-5.1/
    $ sudo ./install.sh
    $ sudo ldconfig

- **Method 3**: Using the apt-server

  1. Add the following x86_64 repository to the source list of the machine:

     a. Open the apt source configuration file in a text editor, for example:

        ``$ sudo vi /etc/apt/sources.list.d/nvidia-apt-source.list``

     b. Confirm the repository name and download URL in the deb commands:

        ``deb https://repo.download.nvidia.com/jetson/x86_64 bionic r32.5``

     c. Save and close the source configuration file.

  2. Install the public key of the x86_64 repository of the APT server:

     ::

        $ sudo apt-key adv --fetch-key https://repo.download.nvidia.com/jetson/jetson-ota-public.asc

  3. Enter the commands:

     ::

        $ sudo apt update
        $ sudo apt install deepstream-5.1

- **Method 4**: Using a Docker container

  DeepStream Docker containers are available on NGC. See the :doc:`DS_docker_containers` section to learn about developing and deploying DeepStream using Docker containers.

Run the deepstream-app (the reference application)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- Go to the samples directory and enter this command:

  ::

    $ deepstream-app -c <path_to_config_file>

  Where ``<path_to_config_file>`` is the pathname of one of the reference application's configuration files, found in ``configs/deepstream-app``. See Package Contents for a list of the available files.

  .. note::

    * To dump the engine file, run the following command:

      ::

        $ sudo deepstream-app -c <path_to_config_file>

    * You can find sample configuration files under the ``/opt/nvidia/deepstream/deepstream-5.1/samples`` directory. Enter this command to see application usage:

      ::

        $ deepstream-app --help

- To show labels in the 2D tiled display view, expand the source of interest with a mouse `left-click` on the source. To return to the tiled display, `right-click` anywhere in the window.
- Keyboard selection of a source is also supported. On the console where the application is running, press the ``z`` key followed by the desired row index (0 to 9), then the column index (0 to 9), to expand the source. To restore the 2D tiled display view, press ``z`` again.

Run precompiled sample applications
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

1. Navigate to the chosen application directory inside ``sources/apps/sample_apps``.
2. Follow that directory's README file to run the application.

.. note::

    If the application encounters errors and cannot create Gst elements, remove the GStreamer cache, then try again. To remove the GStreamer cache, enter this command:

    ::

        $ rm ${HOME}/.cache/gstreamer-1.0/registry.x86_64.bin

    When the application is run with a model that does not have an existing engine file, it may take up to a few minutes (depending on the platform and the model) to generate the file and launch the application. For later runs, the generated engine files can be reused for faster loading.

dGPU Setup for RedHat Enterprise Linux (RHEL)
----------------------------------------------

This section explains how to prepare an RHEL system with NVIDIA dGPU devices before installing the DeepStream SDK.

.. note::

    This document uses the term dGPU ("discrete GPU") to refer to NVIDIA GPU expansion card products such as the NVIDIA Tesla T4 and P4, NVIDIA GeForce GTX 1080, and NVIDIA GeForce RTX 2080. This version of the DeepStream SDK runs on specific dGPU products on x86_64 platforms supported by NVIDIA driver 460+ and |NVIDIA®| TensorRT™ 7.2 and later versions.

You must install the following components:

- RHEL 8.x
- GStreamer 1.14.1
- NVIDIA driver 460.32 (https://www.nvidia.com/download/driverResults.aspx/168347/en-us)
- CUDA 11.1
- TensorRT 7.2.X

Remove all previous DeepStream installations
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

To remove DeepStream 4.0 or later installations:

1. Open the ``uninstall.sh`` file in ``/opt/nvidia/deepstream/deepstream/``.
2. Set ``PREV_DS_VER`` to ``4.0``.
3. Run the following script as ``sudo``: ``./uninstall.sh``
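The three steps above can also be scripted. The following is only an illustrative sketch, not part of the official procedure: it assumes ``uninstall.sh`` contains a ``PREV_DS_VER=<value>`` assignment that can be edited in place with a hypothetical ``sed`` one-liner; editing the file manually as described above is equally valid:

::

    $ cd /opt/nvidia/deepstream/deepstream/
    # Set the previous DeepStream version in the uninstall script
    # (assumes a PREV_DS_VER=<value> line exists in uninstall.sh)
    $ sudo sed -i 's/^PREV_DS_VER=.*/PREV_DS_VER=4.0/' uninstall.sh
    # Run the uninstall script as root
    $ sudo ./uninstall.sh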
Install Dependencies
~~~~~~~~~~~~~~~~~~~~~~

Enter the following command to install the necessary packages before installing the DeepStream SDK:

::

    $ yum install \
        gstreamer1 \
        gstreamer1-plugins-base \
        gstreamer1-plugins-good \
        gstreamer1-plugins-bad-free \
        gstreamer1-plugins-ugly-free \
        gstreamer1-rtsp-server \
        gstreamer1-svt-av1 \
        json-glib \
        openssl \
        libuuid \
        gstreamer1-plugins-base-devel \
        json-glib-devel \
        opencv-devel \
        jansson-devel \
        openssl-devel \
        libuuid-devel

**gst-rtsp-server**: The ``gst-rtsp-server-devel`` package, which is required to compile ``deepstream-app``, is not available for RHEL. Download and extract the sources from https://gstreamer.freedesktop.org/src/gst-rtsp-server/gst-rtsp-server-1.14.5.tar.xz, then build and install them:

::

    $ ./configure
    $ make
    $ sudo make install
    $ sudo cp -r /usr/local/include/gstreamer-1.0/gst/rtsp-server/ /usr/include/gstreamer-1.0/gst/

.. note::

    The packages required for RHEL 8.x are also listed in ``README.rhel`` in the DeepStream package.

Install NVIDIA driver 460.32
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

- Download NVIDIA driver 460.32 from the NVIDIA Unix drivers page at https://www.nvidia.com/download/driverResults.aspx/168347/en-us
- Run the following commands to install it:

  ::

    $ chmod 755 NVIDIA-Linux-x86_64-460.32.03.run
    $ sudo ./NVIDIA-Linux-x86_64-460.32.03.run

Install CUDA Toolkit 11.1
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Download and install CUDA Toolkit 11.1 from the NVIDIA Developer Center at: https://developer.nvidia.com/cuda-downloads

Install TensorRT 7.2.X
^^^^^^^^^^^^^^^^^^^^^^

Download and install TensorRT 7.2.X from the NVIDIA Developer Center: https://developer.nvidia.com/nvidia-tensorrt-download

Install librdkafka (to enable Kafka protocol adaptor for message broker)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

1. Clone the `librdkafka` repository from GitHub:

   ``$ git clone https://github.com/edenhill/librdkafka.git``

2. Configure and build the library:

   ::

    $ cd librdkafka
    $ git reset --hard 7101c2310341ab3f4675fc565f64f0967e135a6a
    $ ./configure
    $ make
    $ sudo make install

3. Copy the generated libraries to the deepstream directory:

   ::

    $ sudo mkdir -p /opt/nvidia/deepstream/deepstream-5.1/lib
    $ sudo cp /usr/local/lib/librdkafka* /opt/nvidia/deepstream/deepstream-5.1/lib

Install the DeepStream SDK
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- **Method 1**: Using the DeepStream tar package

  Download the tar package from https://developer.nvidia.com/deepstream-sdk-v510-x86-64tbz2, then navigate to the location of the downloaded package to extract and install the DeepStream SDK:

  ::

    $ sudo tar -xvf deepstream_sdk_v5.1.0_x86_64.tbz2 -C /
    $ cd /opt/nvidia/deepstream/deepstream-5.1/
    $ sudo ./install.sh
    $ sudo ldconfig

- **Method 2**: Using a Docker container

  DeepStream Docker containers are available on NGC. See the :doc:`DS_docker_containers` section to learn about developing and deploying DeepStream using Docker containers.

Run the deepstream-app (the reference application)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- Go to the samples directory and enter this command:

  ::

    $ deepstream-app -c <path_to_config_file>

  Where ``<path_to_config_file>`` is the pathname of one of the reference application's configuration files, found in ``configs/deepstream-app``. See Package Contents for a list of the available files.

  .. note::

    * To dump the engine file, run the following command:

      ::

        $ sudo deepstream-app -c <path_to_config_file>

    * You can find sample configuration files in the ``/opt/nvidia/deepstream/deepstream-5.1/samples`` directory. Enter this command to see application usage:

      ::

        $ deepstream-app --help

- To show labels in the 2D tiled display view, expand the source of interest with a mouse `left-click` on the source. To return to the tiled display, `right-click` anywhere in the window.
- Keyboard selection of a source is also supported. On the console where the application is running, press the ``z`` key followed by the desired row index (0 to 9), then the column index (0 to 9), to expand the source. To restore the 2D tiled display view, press ``z`` again.

Run precompiled sample applications
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

1. Navigate to the chosen application directory inside ``sources/apps/sample_apps``.
2. Follow that directory's README file to run the application.

.. note::

    If the application encounters errors and cannot create Gst elements, remove the GStreamer cache, then try again. To remove the GStreamer cache, enter this command:

    ::

        $ rm ${HOME}/.cache/gstreamer-1.0/registry.x86_64.bin

    When the application is run with a model that does not have an existing engine file, it may take up to a few minutes (depending on the platform and the model) to generate the file and launch the application. For later runs, the generated engine files can be reused for faster loading.

Running without an X server
------------------------------

The default configuration files provided with the SDK use the EGL-based ``nveglglessink`` as the default renderer (indicated by ``type=2`` in the ``[sink]`` groups). This renderer requires a running X server and fails without one. If no X server is available, the DeepStream reference applications can instead stream the output over RTSP. Enable this by adding an RTSP-out sink group to the configuration file; refer to the ``[sink2]`` group in the ``source30_1080p_dec_infer-resnet_tiled_display_int8.txt`` file for an example. Remember to disable the ``nveglglessink`` renderer by setting ``enable=0`` for the corresponding sink group.
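For orientation, an RTSP-out sink group in a deepstream-app configuration looks roughly like the following. The keys mirror the ``[sink2]`` group in the sample file named above; treat the values here as illustrative and the shipped sample file as the authoritative reference:

::

    # Illustrative RTSP-out sink group (values are examples; see the shipped
    # sample configuration for the exact group)
    [sink2]
    enable=1
    # Type: 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
    type=4
    # Codec: 1=h264 2=h265
    codec=1
    sync=0
    bitrate=4000000
    # Properties used for RTSPStreaming
    rtsp-port=8554
    udp-port=5400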
Platform and OS Compatibility
-------------------------------

The following tables provide information about platform and operating system compatibility in the current and earlier versions of DeepStream.

.. csv-table:: Jetson model Platform and OS Compatibility
   :file: ../text/tables/DS_quickstart_Jetson_platform_os_compatibility.csv
   :widths: 10, 10, 10, 10, 10, 10, 10, 10
   :header-rows: 1

.. csv-table:: dGPU model Platform and OS Compatibility
   :file: ../text/tables/DS_quickstart_dGPU_platform_os_compatibility.csv
   :widths: 10, 10, 10, 10, 10, 10, 10, 10
   :header-rows: 1

DeepStream Triton Inference Server Usage Guidelines
----------------------------------------------------

dGPU
~~~~~~

1. Pull the DeepStream Triton Inference Server Docker image:

   ::

    docker pull nvcr.io/nvidia/deepstream:5.1-21.02-triton

2. Start the container:

   ::

    docker run --gpus all -it --rm -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY nvcr.io/nvidia/deepstream:5.1-21.02-triton

Jetson
~~~~~~~

The Triton Inference Server shared libraries come pre-installed as part of DeepStream on Jetson. No extra steps are required to install the Triton Inference Server.

On both platforms, to run the samples, follow the steps in the `Running the Triton Inference Server samples` section of the README at ``/opt/nvidia/deepstream/deepstream-5.1``.
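Because the dGPU ``docker run`` command above forwards the host's X display into the container, the X server may also need to accept connections from the container before display sinks will render. This is a general X11/Docker consideration rather than a DeepStream-specific requirement; one commonly used (permissive) approach is sketched below:

::

    # On the host, before starting the container: allow clients (including the
    # container) to connect to the X server
    $ xhost +

    # Then start the container with display forwarding as shown above
    $ docker run --gpus all -it --rm \
        -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY \
        nvcr.io/nvidia/deepstream:5.1-21.02-triton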