DeepStream 6.0 Release

DeepStream Getting Started

  • Welcome to the DeepStream Documentation
    • NVIDIA DeepStream Overview
      • DeepStream Graph Architecture
      • DeepStream reference app
      • Getting started with building apps
      • DeepStream in Python
  • Readme First
    • Additional Resources
      • Documentation
      • Forums
  • Quickstart Guide
    • Jetson Setup
      • Install Jetson SDK components
      • Install Dependencies
        • Install librdkafka (to enable Kafka protocol adaptor for message broker)
      • Install latest NVIDIA BSP packages
      • Install the DeepStream SDK
      • Run deepstream-app (the reference application)
        • Boost the clocks
      • Run precompiled sample applications
    • dGPU Setup for Ubuntu
      • Remove all previous DeepStream installations
      • Install Dependencies
        • Install NVIDIA driver 470.63.01
        • Install CUDA Toolkit 11.4.1 (CUDA 11.4 Update 1)
        • Install TensorRT 8.0.1
        • Install librdkafka (to enable Kafka protocol adaptor for message broker)
      • Install the DeepStream SDK
      • Run the deepstream-app (the reference application)
      • Run precompiled sample applications
    • dGPU Setup for RedHat Enterprise Linux (RHEL)
      • Remove all previous DeepStream installations
      • Install Dependencies
        • Install NVIDIA driver 470.63.01
        • Install CUDA Toolkit 11.4.1 (CUDA 11.4 Update 1)
        • Install TensorRT 8.0.1
        • Install librdkafka (to enable Kafka protocol adaptor for message broker)
      • Install the DeepStream SDK
      • Run the deepstream-app (the reference application)
      • Run precompiled sample applications
    • Running without an X server
    • Platform and OS Compatibility
    • DeepStream Triton Inference Server Usage Guidelines
      • dGPU
      • Jetson
    • Using DLA for inference
      • Separate processes
      • Single process
  • Docker Containers
    • A Docker Container for dGPU
    • A Docker Container for Jetson
    • Creating custom DeepStream docker for dGPU using DeepStreamSDK package
    • Creating custom DeepStream docker for Jetson using DeepStreamSDK package

DeepStream Samples

  • C/C++ Sample Apps Source Details
    • Plugin and Library Source Details
  • Python Sample Apps and Bindings Source Details
    • Sample Application Source Details
    • Python Bindings and Application Development
      • Prerequisites
      • Running Sample Applications
      • Pipeline Construction
    • MetaData Access
      • Memory Management
      • Allocations
      • String Access
        • Setting String Fields
        • Reading String Fields
      • Casting
      • Callback Function Registration
      • Optimizations and Utilities
      • Image Data Access
  • DeepStream Reference Application - deepstream-app
    • Application Architecture
    • Reference Application Configuration
      • Expected Output for the DeepStream Reference Application (deepstream-app)
    • Configuration Groups
      • Application Group
      • Tiled-display Group
      • Source Group
      • Streammux Group
      • Preprocess Group
      • Primary GIE and Secondary GIE Group
      • Tracker Group
      • Message Converter Group
      • Message Consumer Group
      • OSD Group
      • Sink Group
      • Tests Group
      • NvDs-analytics Group
    • Application Tuning for DeepStream SDK
      • Performance Optimization
        • DeepStream best practices
        • Jetson optimization
        • Triton
        • Inference Throughput
      • Reducing Spurious Detections
  • DeepStream Reference Application - deepstream-test5 app
    • IoT Protocols supported and cloud configuration
    • Message consumer
    • Smart Record - Event based recording
    • OTA model update
      • Using the OTA functionality
  • DeepStream Reference Application - deepstream-audio app
    • DeepStream Audio Reference Application Architecture and Sample Graphs
  • DeepStream Reference Application on GitHub
    • Use case applications
    • AI models with DeepStream
    • DeepStream features sample
  • Sample Configurations and Streams
    • Contents of the package
      • Scripts included along with package
  • Implementing a Custom GStreamer Plugin with OpenCV Integration Example
    • Description of the Sample Plugin: gst-dsexample
      • GstBaseTransform Class Functions
      • Other supporting functions
    • Enabling and configuring the sample plugin
    • Using the sample plugin in a custom application/pipeline
    • Implementing Custom Logic Within the Sample Plugin
    • Adding NVTX APIs for sample plugin
    • Accessing NvBufSurface memory in OpenCV

TAO Toolkit Integration with DeepStream

  • TAO Toolkit Integration with DeepStream
    • Pre-trained models

Tutorials and How-to's

  • Custom YOLO Model in the DeepStream YOLO App
    • How to Use the Custom YOLO Model
      • Set up the sample
  • NvMultiObjectTracker Parameter Tuning Guide
    • Accuracy-Performance Tradeoffs
      • Visual Feature Types and Feature Sizes
      • Detection Interval
      • Video Frame Size for Tracker
    • Robustness
      • Target Creation Policy
      • Target Termination Policy
    • State Estimation
      • Kalman Filter
    • Data Association
    • DCF Core Tuning
      • DCF Filter Learning

DeepStream Performance

  • Performance
    • TAO Pre-trained models
    • DeepStream reference model and tracker
    • Configuration File Settings for Performance Measurement
    • DeepStream reference model
      • Data center GPU - GA100
        • System Configuration
        • Application Configuration
      • Data center GPU - T4
        • System Configuration
        • Application Configuration
      • Jetson
        • System Configuration
        • Jetson Nano
        • Jetson AGX Xavier
        • Jetson NX
        • Jetson TX2
        • Jetson TX1

DeepStream Custom Model

  • Using a Custom Model with DeepStream
    • Custom Model Implementation Interface
    • Custom Output Parsing
    • IPlugin Implementation
      • How to Use IPluginCreator
      • How to Use IPluginFactory
        • For Caffe Files
        • For Uff Files
        • During Deserialization
    • Input Layer Initialization
    • CUDA Engine Creation for Custom Models
    • IModelParser Interface for Custom Model Parsing

DeepStream Key Features

  • Smart Video Record
    • Smart Video Record Module APIs
    • Smart Video Record Configurations
  • IoT
    • Secure Edge-to-Cloud Messaging
      • 2-way TLS Authentication
        • Overview of Steps
        • TLS Version
        • Key generation
        • Certificate Signing
        • Choice of Cipher
        • Configure TLS options in Kafka config file for DeepStream
      • SASL/Plain
        • Overview of Steps
        • TLS Configuration
        • Credential Storage
        • Choosing Between 2-way TLS and SASL/Plain
      • Impact on performance
    • Bidirectional Messaging
      • Edge-to-Cloud
      • Cloud-to-Edge
      • NvMsgbroker Library
    • Autoreconnect feature
  • On the Fly Model Update
    • Assumptions
  • NTP Timestamp in DeepStream
  • AV Sync in DeepStream
    • Setup for RTMP/RTSP Input streams for testing
      • RTMP Server Setup
        • Command to simulate 2 RTMP streams using ffmpeg
      • RTSP Server Setup
    • AVSync Reference Pipelines
      • Pipelines with existing nvstreammux component
        • RTMP_IN -> RTMP_OUT
        • FILE_IN->RTSP_OUT
        • FILE_IN->RTMP_OUT
        • RTMP_IN->FILE_OUT
        • RTSP_IN->FILE_OUT
        • FILE_IN->FILE_OUT
        • RTSP_IN->RTSP_OUT
        • RTMP_IN->RTSP_OUT
      • Reference AVSync + ASR (Automatic Speech Recognition) Pipelines with existing nvstreammux
        • RTMP_IN->RTMP_OUT
        • RTSP_IN->RTMP_OUT
        • FILE_IN->RTMP_OUT
      • Pipelines with New nvstreammux component
        • RTMP_IN->RTMP_OUT
        • RTSP_IN->RTMP_OUT
        • FILE_IN->RTSP_OUT
        • FILE_IN->RTMP_OUT
        • RTMP_IN->FILE_OUT
        • RTSP_IN->FILE_OUT
        • FILE_IN->FILE_OUT
        • RTSP_IN->RTSP_OUT
        • RTMP_IN->RTSP_OUT
      • Reference AVSync + ASR Pipelines (with new nvstreammux)
        • RTMP_IN->RTMP_OUT
        • RTSP_IN->RTMP_OUT
        • FILE_IN->RTMP_OUT
      • Gst-pipeline with audiomuxer (single source, without ASR + new nvstreammux)
        • RTMP_IN->FILE_OUT
  • DeepStream 3D Action Recognition App
    • Getting Started
      • Prerequisites
      • Run 3D Action Recognition Examples
      • Run 2D Action Recognition Examples
    • DeepStream 3D Action Recognition App Configuration Specifications
      • deepstream-3d-action-recognition [action-recognition] group settings
      • Custom sequence preprocess lib user settings [user-configs] for gst-nvdspreprocess
      • Custom lib and `gst-nvdspreprocess` Settings for Action Recognition
    • Build Custom sequence preprocess lib and application From Source

DeepStream Application Migration

  • Application Migration to DeepStream 6.0 from DeepStream 5.X
    • Major Application Differences with DeepStream 5.X
    • Running DeepStream 5.X compiled Apps in DeepStream 6.0
    • Compiling DeepStream 5.1 Apps in DeepStream 6.0
    • Low-level Object Tracker Library Migration from DeepStream 5.1 Apps to DeepStream 6.0

DeepStream Plugin Guide

  • GStreamer Plugin Overview
  • MetaData in the DeepStream SDK
    • NvDsBatchMeta: Basic Metadata Structure
    • User/Custom Metadata Addition inside NvDsBatchMeta
    • Adding Custom Meta in Gst Plugins Upstream from Gst-nvstreammux
      • Adding metadata to the plugin before Gst-nvstreammux
      • New metadata fields
  • Gst-nvdspreprocess (Alpha)
    • Inputs and Outputs
    • Features
    • Custom library Interfaces
    • Gst-nvdspreprocess File Configuration Specifications
    • Gst Properties
    • Sample pipelines
  • Gst-nvinfer
    • Inputs and Outputs
    • Features
    • Gst-nvinfer File Configuration Specifications
    • Gst Properties
    • Clustering algorithms supported by nvinfer
      • cluster-mode = 0 | GroupRectangles
      • cluster-mode = 1 | DBSCAN
      • cluster-mode = 2 | NMS
      • cluster-mode = 3 | Hybrid
      • cluster-mode=4 | No clustering
    • Tensor Metadata
      • To read or parse inference raw tensor data of output layers
    • Segmentation Metadata
  • Gst-nvinferaudio
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvinferserver
    • Inputs and Outputs
    • Gst-nvinferserver File Configuration Specifications
    • Features
    • Gst Properties
    • DeepStream Triton samples
    • DeepStream Triton gRPC support
    • Triton Ensemble Models
    • Custom Process interface IInferCustomProcessor for Extra Input, LSTM Loop, Output Data Postprocess
    • Tensor Metadata Output for DownStream Plugins
      • To read or parse inference raw tensor data of output layers
    • Segmentation Metadata
  • Gst-nvtracker
    • Inputs and Outputs
    • Gst Properties
    • NvDsTracker API for Low-Level Tracker Library
    • NvMultiObjectTracker : A Reference Low-Level Tracker Library
    • Unified Tracker Architecture for Composable Multi-Object Tracker
    • Work Flow and Core Modules in The NvMultiObjectTracker Library
      • Data Association
      • Target Management and Error Handling
      • State Estimation
      • Motion-based Target Re-Association
      • Bounding-box Unclipping
      • Configuration Parameters
    • IOU Tracker
    • NvDCF Tracker
      • Visual Tracking
      • Data Association
      • Visualization of Sample Outputs and Correlation Responses
        • PeopleNet + NvDCF
        • DetectNet_v2 + NvDCF
        • DetectNet_v2 (w/ interval=2) + NvDCF
      • Configuration Parameters
    • DeepSORT Tracker (Alpha)
      • Re-ID
        • Re-ID Similarity Score
        • Setup Official Re-ID Model
      • Data Association
      • Customize Re-ID Model
      • Configuration Parameters
      • Implementation Details and Reference
    • Low-Level Tracker Comparisons and Tradeoffs
    • How to Implement a Custom Low-Level Tracker Library
  • Gst-nvstreammux
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvstreammux New (Beta)
    • Inputs and Outputs
    • Features
    • Gst Properties
    • Mux Config Properties
    • NvStreamMux Tuning Solutions for specific use cases
      • 1. Aim
      • 2. Important Tuning parameters
      • 3. Video + Audio muxing Usecases
        • 3.1 Video and Audio muxing; file sources of different fps
        • 3.2 Video and Audio muxing; RTMP/RTSP sources
      • 4 Troubleshooting
        • 4.1 GstAggregator plugin -> filesink does not write data into the file
        • 4.2 nvstreammux WARNING “Lot of buffers are being dropped”
    • Known Issues and FAQ
      • 1. Observing video and/or audio stutter (low framerate)
      • 2. Sink plugin shall not move asynchronously to PAUSED
      • 3. Heterogeneous batching
      • 4. Adaptive Batching
  • Gst-nvstreamdemux
    • Inputs and Outputs
      • Use case 1
      • Use case 2
      • Use case 3
  • Gst-nvmultistreamtiler
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvdsosd
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvdsvideotemplate
    • Inputs and Outputs
    • Features
    • customlib_impl Interfaces
    • Gst Properties
    • Sample pipelines
  • Gst-nvdsaudiotemplate
    • Inputs and Outputs
    • Features
    • customlib_impl Interfaces
    • Gst Properties
    • Sample pipelines
  • Gst-nvvideoconvert
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvdewarper
    • Inputs and Outputs
    • Features
    • Configuration File Parameters
    • Gst Properties
  • Gst-nvof
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvofvisual
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvsegvisual
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvvideo4linux2
    • Decoder
      • Inputs and Outputs
      • Features
      • Gst Properties
    • Encoder
      • Inputs and Outputs
      • Features
      • Gst Properties
  • Gst-nvjpegdec
    • Inputs and Outputs
    • Features
    • Gst Properties
  • Gst-nvmsgconv
    • Inputs and Outputs
    • Features
    • Gst Properties
    • Schema Customization
    • Payload with Custom Objects
  • Gst-nvmsgbroker
    • Inputs and Outputs
    • Features
    • Gst Properties
    • nvds_msgapi: Protocol Adapter Interface
      • nvds_msgapi_connect(): Create a Connection
      • nvds_msgapi_send() and nvds_msgapi_send_async(): Send an event
      • nvds_msgapi_subscribe(): Consume data by subscribing to topics
      • nvds_msgapi_do_work(): Incremental Execution of Adapter Logic
      • nvds_msgapi_disconnect(): Terminate a Connection
      • nvds_msgapi_getversion(): Get Version Number
      • nvds_msgapi_get_protocol_name(): Get name of the protocol
      • nvds_msgapi_connection_signature(): Get Connection signature
    • nvds_kafka_proto: Kafka Protocol Adapter
      • Installing Dependencies
      • Using the Adapter
      • Configuring Protocol Settings
      • Programmatic Integration
      • Security for Kafka
    • Azure MQTT Protocol Adapter Libraries
      • Installing Dependencies
      • Setting Up Azure IoT
      • Configuring Adapter Settings
      • Using the Adapter
        • Connection Details for the Device Client Adapter
        • Connection Details for the Module Client Adapter
      • Monitor Adapter Execution
        • Azure device client library log messages
        • Azure Module Client Library Log Messages
      • Message Topics and Routes
    • AMQP Protocol Adapter
      • Installing Dependencies
      • AMQP broker
      • Configure Adapter Settings
      • Using the Adapter
      • Programmatic Integration
      • Monitor Adapter Execution
    • REDIS Protocol Adapter
      • Installing Dependencies
      • REDIS server
      • Configure Adapter Settings
      • Using the Adapter
      • Programmatic Integration
      • Monitor Adapter Execution
    • nv_msgbroker: Message Broker interface
      • nv_msgbroker_connect(): Create a Connection
      • nv_msgbroker_send_async(): Send an event asynchronously
      • nv_msgbroker_subscribe(): Consume data by subscribing to topics
      • nv_msgbroker_disconnect(): Terminate a Connection
      • nv_msgbroker_version(): Get Version Number
      • Autoreconnect feature
      • nvds_logger: Logging Framework
        • Enabling Logging
        • Filtering Logs
        • Retiring and Managing Logs
        • Generating Logs
  • Gst-nvdsanalytics
    • Inputs and Outputs
    • Features
    • Gst Properties
    • Configuration File Parameters
  • Gst-nvdsasr
    • Inputs and Outputs
    • Features
    • DS-Riva ASR Yaml File Configuration Specifications
    • Gst Properties
      • Riva ASR model data generation and gRPC installation
  • Gst-nvds_text_to_speech (Alpha)
    • Inputs and Outputs
    • Features
    • DS-Riva TTS Yaml File Configuration Specifications
    • Gst Properties
      • Riva TTS Service Initiation
      • gRPC C++ Installation
    • Sample Application
  • Gst-nvdsudpsrc
    • Inputs and Outputs
    • Features
    • Gst Properties

DeepStream Troubleshooting and FAQ

  • Troubleshooting
    • You are migrating from DeepStream 5.x to DeepStream 6.0
    • “NvDsBatchMeta not found for input buffer” error while running DeepStream pipeline
    • The DeepStream reference application fails to launch, or any plugin fails to load
    • Application fails to run when the neural network is changed
    • The DeepStream application is running slowly (Jetson only)
    • The DeepStream application is running slowly
    • On NVIDIA Jetson Nano™, deepstream-segmentation-test starts as expected, but crashes after a few minutes, rebooting the system
    • Errors occur when deepstream-app is run with a number of streams greater than 100
    • Errors occur when deepstream-app fails to load plugin Gst-nvinferserver
    • Tensorflow models are running into OOM (Out-Of-Memory) problem
    • Memory usage keeps increasing when the source is a long-duration containerized file (e.g. mp4, mkv)
    • Stale frames observed on RTSP output
    • Troubleshooting in NvDCF Parameter Tuning
      • Flickering Bbox
      • Frequent tracking ID changes although there are no nearby objects
      • Frequent tracking ID switches to the nearby objects
      • Error while running ONNX / Explicit batch dimension networks
    • DeepStream plugins failing to load without DISPLAY variable set when launching DS dockers
    • NVIDIA driver installation issues
      • 1. When the user expects to use a display window
      • 2. When the user expects not to use a display window
    • NVIDIA TensorRT installation issues
    • Graph Composer Troubleshooting
      • My component is not visible in the composer even after registering the extension with registry
      • My component is getting registered as an abstract type.
      • When executing a graph, the execution ends immediately with the warning “No system specified. Nothing to do”
  • Frequently Asked Questions
    • DeepStream General topics
      • How do I uninstall DeepStream?
      • What types of input streams does DeepStream 6.0 support?
      • What’s the throughput of H.264 and H.265 decode on dGPU (Tesla)?
      • How can I run the DeepStream sample application in debug mode?
      • Where can I find the DeepStream sample applications?
      • How can I verify that CUDA was installed correctly?
      • How can I interpret frames per second (FPS) display information on console?
      • My DeepStream performance is lower than expected. How can I determine the reason?
      • How can I specify RTSP streaming of DeepStream output?
      • What is the official DeepStream Docker image and where do I get it?
      • What is the recipe for creating my own Docker image?
      • How can I display graphical output remotely over VNC? How can I determine whether X11 is running?
      • Why does the deepstream-nvof-test application show the error message “Device Does NOT support Optical Flow Functionality” if run with NVIDIA Tesla P4 or NVIDIA Jetson Nano, Jetson TX2, or Jetson TX1?
      • Why is the Gst-nvstreammux plugin required in DeepStream 4.0+?
      • Why is a Gst-nvegltransform plugin required on a Jetson platform upstream from Gst-nveglglessink?
      • How do I profile DeepStream pipeline?
      • How can I check GPU and memory utilization on a dGPU system?
      • What is the approximate memory utilization for 1080p streams on dGPU?
      • When deepstream-app is run in loop on Jetson AGX Xavier using “while true; do deepstream-app -c <config_file>; done;”, after a few iterations I see low FPS for certain iterations. Why is that?
      • Why do I get the error “Makefile:13: *** "CUDA_VER is not set".  Stop.” when I compile DeepStream sample applications?
      • How can I construct the DeepStream GStreamer pipeline?
      • The property bufapi-version is missing from nvv4l2decoder; what should I do?
      • Why am I getting “ImportError: No module named google.protobuf.internal” when running convert_to_uff.py on Jetson AGX Xavier?
      • Does DeepStream Support 10 Bit Video streams?
      • What is the difference between batch-size of nvstreammux and nvinfer? What are the recommended values for nvstreammux batch-size?
      • Why do some caffemodels fail to build after upgrading to DeepStream 6.0?
      • How do I configure the pipeline to get NTP timestamps?
      • Why is the NTP timestamp value 0?
      • How to handle operations not supported by Triton Inference Server?
      • Why do I see the confidence value as -0.1?
      • How to use the OSS version of the TensorRT plugins in DeepStream?
      • Why do I see the below Error while processing H265 RTSP stream?
      • Why do I observe a lot of buffers being dropped when running deepstream-nvdsanalytics-test application on Jetson Nano ?
      • Why do I observe “A lot of buffers are being dropped” when running live camera streams, even for a single or few streams, and why does the output look jittery?
      • Why does the RTSP source used in gst-launch pipeline through uridecodebin show blank screen followed by the error - WARNING: from element /GstPipeline:pipeline0/GstNvStreamMux:m: No Sources found at the input of muxer. Waiting for sources?
      • What if I do not get the expected 30 FPS from a camera using the v4l2src plugin in the pipeline, but instead get 15 FPS or less?
      • On Jetson platform, I get same output when multiple Jpeg images are fed to nvv4l2decoder using multifilesrc plugin. Why is that?
      • How do I obtain individual sources after batched inferencing/processing? What are the sample pipelines for nvstreamdemux?
      • Why do I encounter the error “memory type configured and i/p buffer mismatch ip_surf 0 muxer 3” while running a DeepStream pipeline?
      • How does secondary GIE crop and resize objects?
      • How to save frames from GstBuffer?
      • What are different Memory types supported on Jetson and dGPU?
      • What are different Memory transformations supported on Jetson and dGPU?
      • Why does my image look distorted if I wrap my cudaMalloc’ed memory into NvBufSurface and provide to NvBufSurfTransform?
      • Why am I getting the following warning when running deepstream-app for the first time?
      • How to find out the maximum number of streams supported on given platform?
      • How to find the performance bottleneck in DeepStream?
      • How to fix “cannot allocate memory in static TLS block” error?
    • Smart Record
      • Does smart record module work with local video streams?
      • Are multiple parallel records on same source supported?
      • What if I forgot to stop the recording?
      • I started the record with a set duration. Can I stop it before that duration ends?
      • What if I don’t set default duration for smart record?
      • What if I don’t set video cache size for smart record?
      • What is maximum duration of data I can cache as history for smart record?
      • Can I record the video with bounding boxes and other information overlaid?
    • Triton
      • Which Triton version is supported in DeepStream 6.0 release?
      • Can Jetson platform support the same features as dGPU for Triton plugin?
      • Can Gst-nvinferserver (DeepStream Triton plugin) run on the Nano platform?
      • How to enable TensorRT optimization for Tensorflow and ONNX models?
      • How to tune GPU memory for Tensorflow models?
      • Can Gst-nvinferserver support models cross processes or containers?
      • Can users set different model repos when running multiple Triton models in single process?
      • What is the difference between DeepStream classification and Triton classification?
      • Why is max_batch_size: 0 used in some Triton model config files (samples/triton_model_repo/*/config.pbtxt)?
      • How to support Triton ensemble model?
      • Does Gst-nvinferserver support Triton multiple instance groups?
      • Can Gst-nvinferserver support inference on multiple GPUs?
      • What are the batch-size differences for a single model in different config files (gie group in source, config_inferserver.., and Triton model’s config.pbtxt)?
      • How to use NVTX for profiling?

Graph Composer

  • Introduction
    • Graph Composer Ecosystem
    • Contents of the release
  • Graph Specification
    • Concepts
      • Graph
      • SubGraph
      • Node
      • Components
      • Edges
      • Extension
    • Graph File Format
  • System Requirements
  • Installation
  • Application Workflow
    • Launch Composer
    • Sync Extensions
    • Load Graph
    • Deploy Graph
  • Creating an AI Application
    • Launch Composer
    • Drag and Drop Components
    • Configure Components
    • Connect Components
    • Add GStreamer Scheduler
    • Count Number Of People
    • Save Graph
    • Use Multiple inputs
    • Runtime add/remove inputs
  • Reference graphs
    • Installing the reference graphs
    • deepstream-test1
      • Graph Files
      • Graph
      • Sample Output
    • deepstream-test2
      • Graph Files
      • Graph
      • Sample Output
    • deepstream-test3
      • Graph Files
      • Graph
      • Sample Output
    • deepstream-test4
      • Graph Files
      • Graph
      • Sample Output
    • deepstream-test5
      • Graph and related files
      • Graph
      • Sample Output
    • deepstream-runtime-src-add-del
      • Graph Files
      • Graph
      • Sample Output
    • deepstream-template-plugin
      • Graph Files
      • Graph
      • Sample Output
    • deepstream-app
      • Graph Files
    • deepstream-audio
      • Graph files
      • Graph
      • Sample Output
    • deepstream-360d
      • Graph and related files
      • Graph
      • Sample Output
    • deepstream-triton
      • Graph and related files
      • Graph
      • Sample Output
    • deepstream-camera
      • Graph and related files
      • Graph
      • Sample Output
    • deepstream-action-recognition
      • Graph and related files
      • Graph
      • Sample Output
    • deepstream-subgraph
      • Graph and related files
      • Graph
      • Sample Output
  • Development Workflow
    • Sync Extensions
      • Using command-line
      • Using Composer UI
    • Develop New Extension
      • Using command-line
        • Generating a non-DeepStream (GStreamer) extension
        • Generating a DeepStream (GStreamer) extension
      • Using Composer UI
    • Create and test graph
  • Developing Extensions for DeepStream
    • Extension and component factory registration boilerplate
    • A simple DeepStream component
    • Implementation of INvDsInPlaceDataHandler
    • Controlling Properties
    • Triggering Actions
    • Handling signal callbacks
    • Implementation of a Configuration Provider component
  • DeepStream Components
    • Interfaces
      • Element - INvDsElement
      • I/Os - INvDsIO/INvDsInput/INvDsOutput
      • Connections - INvDsConnection
      • DeepStream Domain Component - INvDsComponent
      • Probe - INvDsProbe
      • Probe Callback Implementation - INvDsInPlaceDataHandler
      • Action - INvDsAction
      • Signal - INvDsSignal
      • Element Property Controller – INvDsPropertyController
      • Configurations – INvDsConfigComponent template and specializations
      • INvDsInferModelConfigComponent
      • INvDsVideoTemplatePluginConfigComponent / INvDsAudioTemplatePluginConfigComponent
    • Data Components
      • GstBufferHandle
      • NvBufSurfaceHandle
      • NvBufAudioHandle
      • NvDsBatchMetaHandle
    • Basic Components
      • I/Os
    • Connections
      • NvDsConnection
      • NvDsMultiSrcConnection
    • Probes
      • NvDsProbe
      • NvDsProbeConnector
      • NvDsScheduler
  • Registry
    • Repository Manager
      • NVIDIA Cloud Repository
      • Local Workspace
    • Cache
    • Extension Registration
    • Installing graphs for deployment
  • Registry Command Line Interface
    • cache
    • repo
      • repo clean
      • repo list
      • repo info
      • repo sync
    • extn
      • extn add
      • extn sync
      • extn import
      • extn list
      • extn info
      • extn versions
      • extn variants
      • extn dependencies
    • comp
      • comp list
      • comp info
    • graph
      • graph install
  • Composer
    • User Interface
      • Menu Bar
      • Toolbar
      • Component List
    • Create New Application
    • Open and Save Application Graphs
    • Compose an Application Graph
      • Finding the right component
      • Creating a Component Instance
      • Understanding the Component Handles
        • Linking and Unlinking components
      • Node
      • Setting up a Connection from an Input to an Output
      • Changing the Component Properties
      • Editor Features
      • Subgraph
    • Run Graph
      • Options
    • Build Container Image
      • Options
    • Generate Gstreamer Extension
      • Options
    • Restrictions
  • Graph Execution Engine
  • Container Builder
    • Prerequisites
    • Container Builder Features
    • Container Builder Tool Usage
    • Run Container Builder
    • Container Builder Configuration
    • A Basic Example of Container Builder Configuration
    • A Multi-Stage Example
    • Container builder main control section specification
    • Container dockerfile stage section specification

Extensions Manual

  • NvDsAnalyticsExt
    • Components
      • nvidia::deepstream::NvDsAnalytics
        • Parameters
  • NvDsBaseExt
    • Interfaces
      • nvidia::deepstream::INvDsKeyboardInput
    • Components
      • nvidia::deepstream::NvDsStaticOutput
      • nvidia::deepstream::NvDsDynamicOutput
      • nvidia::deepstream::NvDsOnRequestOutput
      • nvidia::deepstream::NvDsStaticInput
      • nvidia::deepstream::NvDsOnRequestInput
      • nvidia::deepstream::NvDsMultiOutput
      • nvidia::deepstream::NvDsProbeConnector
      • nvidia::deepstream::NvDsProbe
        • Parameters
      • nvidia::deepstream::NvDsConnection
        • Parameters
      • nvidia::deepstream::NvDsMultiSrcConnection
        • Parameters
      • nvidia::deepstream::NvDsKeyboardInput
      • nvidia::deepstream::NvDsScheduler
      • nvidia::deepstream::NvDsToGxfBridge
        • Parameters
      • nvidia::deepstream::NvGxfToDsBridge
        • Parameters
      • nvidia::deepstream::NvDsGxfObjectDataTranslator
      • nvidia::deepstream::NvDsGxfAudioClassificationDataTranslator
      • nvidia::deepstream::NvDsGxfOpticalFlowDataTranslator
      • nvidia::deepstream::NvDsGxfSegmentationDataTranslator
      • nvidia::deepstream::NvDsGxfInferTensorDataTranslator
      • nvidia::deepstream::NvDsQueue
        • Parameters
      • nvidia::deepstream::NvDsTee
        • Parameters
      • nvidia::deepstream::NvDsBufferSync
        • Parameters
  • NvDsBodyPose2D
    • Components
      • nvidia::BodyPose2D::PostProcess
        • Parameters
      • nvidia::BodyPose2D::BodyPose2dModel
        • Parameters
      • nvidia::BodyPose2D::NvDsGxfBodypose2dDataTranslator
  • NvDsCloudMsgExt
    • Components
      • nvidia::deepstream::NvDsMessage
      • nvidia::deepstream::NvDsMsgRelayTransmitter
      • nvidia::deepstream::NvDsMsgRelayReceiver
      • nvidia::deepstream::NvDsMsgBrokerC2DReceiver
        • Parameters
      • nvidia::deepstream::NvDsMsgBrokerD2CTransmitter
        • Parameters
      • nvidia::deepstream::NvDsMsgRelay
        • Parameters
      • nvidia::deepstream::NvDsMsgBroker
        • Parameters
      • nvidia::deepstream::NvDsMsgConverter
        • Parameters
      • nvidia::deepstream::NvDsMsgConvBroker
        • Parameters
  • NvDsConverterExt
    • Components
      • nvidia::deepstream::NvDsVideoConvert
        • Parameters
      • nvidia::deepstream::AudioConvert
        • Parameters
      • nvidia::deepstream::AudioResample
        • Parameters
  • NvDsDewarperExt
    • Components
      • nvidia::deepstream::NvDsAisleFilter
        • Parameters
      • nvidia::deepstream::NvDsBBoxFilter
        • Parameters
      • nvidia::deepstream::NvDsDewarper
        • Parameters
      • nvidia::deepstream::NvDsSpotChangeSignal
      • nvidia::deepstream::NvDsSpot
        • Parameters
  • NvDsEmotionExt
    • Components
      • nvidia::Emotion::NvDsEmotionTemplateLib
  • NvDsFacialLandmarks
    • Components
      • nvidia::FacialLandmarks::PostProcess
        • Parameters
      • nvidia::FacialLandmarks::PreProcess
        • Parameters
      • nvidia::FacialLandmarks::FacialLandmarksPgieModel
        • Parameters
      • nvidia::FacialLandmarks::FacialLandmarksSgieModel
        • Parameters
      • nvidia::FacialLandmarks::FacialLandmarksSgieModelV2
        • Parameters
      • nvidia::FacialLandmarks::NvDsGxfFacialLandmarksTranslator
  • NvDsGesture
    • Components
      • nvidia::Gesture::PreProcess
        • Parameters
      • nvidia::Gesture::GestureModel
        • Parameters
  • NvDsGazeExt
    • Components
      • nvidia::Gaze::NvDsGazeTemplateLib
      • nvidia::Gaze::NvDsGxfGazeDataTranslator
  • NvDsHeartRateExt
    • Components
      • nvidia::HeartRate::NvDsHeartRateTemplateLib
      • nvidia::HeartRate::NvDsGxfHeartRateDataTranslator
  • NvDsInferenceExt
    • Components
      • nvidia::deepstream::NvDsModelUpdatedSignal
      • nvidia::deepstream::NvDsInferVideoPropertyController
      • nvidia::deepstream::NvDsInferVideo
        • Parameters
      • nvidia::deepstream::NvDsAsr
        • Parameters
      • nvidia::deepstream::NvDsInferAudio
        • Parameters
      • nvidia::deepstream::NvDsPreProcess
        • Parameters
  • NvDsInferenceUtilsExt
    • Components
      • nvidia::deepstream::NvDsKittiDump
        • Parameters
      • nvidia::deepstream::NvDsFpsMeasurement
        • Parameters
      • nvidia::deepstream::NvDsLatencyMeasurement
        • Parameters
      • nvidia::deepstream::NvDsAudioClassificationPrint
        • Parameters
      • nvidia::deepstream::NvDsPerClassObjectCounting
        • Parameters
      • nvidia::deepstream::NvDsModelEngineWatchOTFTrigger
        • Parameters
      • nvidia::deepstream::NvDsRoiClassificationResultParse
        • Parameters
  • NvDsInterfaceExt
    • Interfaces
      • nvidia::deepstream::INvDsElement
      • nvidia::deepstream::INvDsIO
      • nvidia::deepstream::INvDsInput
      • nvidia::deepstream::INvDsOutput
      • nvidia::deepstream::INvDsProbe
      • nvidia::deepstream::INvDsConnection
      • nvidia::deepstream::INvDsComponent
      • nvidia::deepstream::INvDsInPlaceDataHandler
      • nvidia::deepstream::INvDsAction
      • nvidia::deepstream::INvDsSignal
      • nvidia::deepstream::INvDsPropertyController
      • nvidia::deepstream::INvDsAudioTemplatePluginConfigComponent
      • nvidia::deepstream::INvDsVideoTemplatePluginConfigComponent
      • nvidia::deepstream::INvDsInferModelConfigComponent
      • nvidia::deepstream::INvDsGxfDataTranslator
    • Components
      • nvidia::deepstream::NvBufSurfaceHandle
      • nvidia::deepstream::NvBufAudioHandle
      • nvidia::deepstream::NvDsBatchMetaHandle
      • nvidia::deepstream::GstBufferHandle
  • NvDsMuxDemuxExt
    • Components
      • nvidia::deepstream::NvDsStreamDemux
        • Parameters
      • nvidia::deepstream::NvDsStreamMux
        • Parameters
      • nvidia::deepstream::NvDsStreamDemuxNew
        • Parameters
      • nvidia::deepstream::NvDsStreamMuxNew
        • Parameters
  • NvDsOpticalFlowExt
    • Components
      • nvidia::deepstream::NvDsOpticalFlow
        • Parameters
      • nvidia::deepstream::NvDsOpticalFlowVisual
        • Parameters
  • NvDsOutputSinkExt
    • Components
      • nvidia::deepstream::NvDsFakeSink
        • Parameters
      • nvidia::deepstream::NvDsFileOut
        • Parameters
      • nvidia::deepstream::NvDsVideoRendererPropertyController
      • nvidia::deepstream::NvDsVideoRenderer
        • Parameters
      • nvidia::deepstream::NvDsRtspOut
        • Parameters
      • nvidia::deepstream::XvImageSink
        • Parameters
      • nvidia::deepstream::AlsaAudioRenderer
        • Parameters
  • NvDsSampleExt
    • Components
      • nvidia::deepstream::NvDsSampleProbeMessageMetaCreation
        • Parameters
      • nvidia::deepstream::NvDsSampleSourceManipulator
        • Parameters
      • nvidia::deepstream::NvDsSampleVideoTemplateLib
        • Parameters
      • nvidia::deepstream::NvDsSampleAudioTemplateLib
        • Parameters
      • nvidia::deepstream::NvDsSampleC2DSmartRecordTrigger
        • Parameters
      • nvidia::deepstream::NvDsSampleD2C_SRMsgGenerator
        • Parameters
  • NvDsSampleModelsExt
    • Components
      • nvidia::deepstream::NvDsResnet10_4ClassDetectorModel
        • Parameters
      • nvidia::deepstream::NvDsSecondaryCarColorClassifierModel
        • Parameters
      • nvidia::deepstream::NvDsSecondaryCarMakeClassifierModel
        • Parameters
      • nvidia::deepstream::NvDsSecondaryVehicleTypeClassifierModel
        • Parameters
      • nvidia::deepstream::NvDsSonyCAudioClassifierModel
        • Parameters
      • nvidia::deepstream::NvDsCarDetector360dModel
        • Parameters
  • NvDsSourceExt
    • Components
      • nvidia::deepstream::NvDsSourceInfoLoader
        • Parameters
      • nvidia::deepstream::NvDsStartSrAction
      • nvidia::deepstream::NvDsStopSrAction
      • nvidia::deepstream::NvDsSrDoneSignal
      • nvidia::deepstream::NvDsSingleSrcInput
        • Parameters
      • nvidia::deepstream::NvDsSourceManipulationAction
      • nvidia::deepstream::NvDsMultiSourceSmartRecordAction
      • nvidia::deepstream::NvDsMultiSrcInput
        • Parameters
      • nvidia::deepstream::NvDsMultiSrcWarpedInput
        • Parameters
      • nvidia::deepstream::NvDsRecordAction
      • nvidia::deepstream::NvDsMultiSrcInputWithRecord
        • Parameters
      • nvidia::deepstream::NvDsCameraSrcInput
        • Parameters
      • nvidia::deepstream::VideoTestSrc
        • Parameters
      • nvidia::deepstream::AudioTestSrc
        • Parameters
      • nvidia::deepstream::AlsaAudioInput
        • Parameters
  • NvDsTemplateExt
    • Components
      • nvidia::deepstream::NvDsAudioTemplate
        • Parameters
      • nvidia::deepstream::NvDsVideoTemplate
        • Parameters
  • NvDsTrackerExt
    • Components
      • nvidia::deepstream::NvDsTracker
        • Parameters
  • NvDsTranscodeExt
    • Components
      • nvidia::deepstream::NvDsJpegDecoder
        • Parameters
      • nvidia::deepstream::NvDsVideoDecoder
        • Parameters
      • nvidia::deepstream::NvDsH264Encoder
        • Parameters
      • nvidia::deepstream::NvDsH265Encoder
        • Parameters
  • NvDsTritonExt
    • Components
      • nvidia::deepstream::NvDsTriton
        • Parameters
  • NvDsVisualizationExt
    • Components
      • nvidia::deepstream::NvDsOSDPropertyController
      • nvidia::deepstream::NvDsOSD
        • Parameters
      • nvidia::deepstream::NvDsTiler
        • Parameters
      • nvidia::deepstream::NvDsSegVisual
        • Parameters
      • nvidia::deepstream::NvDsBlender
        • Parameters
      • nvidia::deepstream::NvDsTilerEventHandler
        • Parameters

DeepStream API Guide

  • DeepStream API Guides

DeepStream Legal Information

  • DeepStream Legal Information
    • Notice
    • Trademarks
    • Copyright

DeepStream Feedback

  • Feedback form
    • Request Documentation Fix

DeepStream Legal Information

Notice

ALL NVIDIA DESIGN SPECIFICATIONS, REFERENCE BOARDS, FILES, DRAWINGS, DIAGNOSTICS, LISTS, AND OTHER DOCUMENTS (TOGETHER AND SEPARATELY, “MATERIALS”) ARE BEING PROVIDED “AS IS.” NVIDIA MAKES NO WARRANTIES, EXPRESS, IMPLIED, STATUTORY, OR OTHERWISE WITH RESPECT TO THE MATERIALS, AND ALL EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY IMPLIED WARRANTY OR CONDITION OF TITLE, MERCHANTABILITY, SATISFACTORY QUALITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT, ARE HEREBY EXCLUDED TO THE MAXIMUM EXTENT PERMITTED BY LAW.

Information furnished is believed to be accurate and reliable. However, NVIDIA Corporation assumes no responsibility for the consequences of use of such information or for any infringement of patents or other rights of third parties that may result from its use. No license is granted by implication or otherwise under any patent or patent rights of NVIDIA Corporation. Specifications mentioned in this publication are subject to change without notice. This publication supersedes and replaces all information previously supplied. NVIDIA Corporation products are not authorized for use as critical components in life support devices or systems without express written approval of NVIDIA Corporation.

Trademarks

NVIDIA, the NVIDIA logo, CUDA, Jetson, Jetson Nano, NVIDIA AGX, Tegra, TensorRT, Tesla and Xavier are trademarks or registered trademarks of NVIDIA Corporation in the United States and other countries. Other company and product names may be trademarks of the respective companies with which they are associated. The Android robot is reproduced or modified from work created and shared by Google and is used according to terms described in the Creative Commons 3.0 Attribution License. HDMI, the HDMI logo, and High-Definition Multimedia Interface are trademarks or registered trademarks of HDMI Licensing LLC. ARM, AMBA, and ARM Powered are registered trademarks of ARM Limited. Cortex, MPCore and Mali are trademarks of ARM Limited. All other brands or product names are the property of their respective holders. “ARM” is used to represent ARM Holdings plc; its operating company ARM Limited; and the regional subsidiaries ARM Inc.; ARM KK; ARM Korea Limited; ARM Taiwan Limited; ARM France SAS; ARM Consulting (Shanghai) Co. Ltd.; ARM Germany GmbH; ARM Embedded Technologies Pvt. Ltd.; ARM Norway, AS and ARM Sweden AB.

Copyright

© 2021 by NVIDIA CORPORATION & AFFILIATES. All rights reserved.
