• Check the GStreamer-1.0 version with the following command:
gst-inspect-1.0 --version
GStreamer-1.0 Plugin Reference
Note
The gst-omx plugin is deprecated in Jetson Linux Driver Package Release 32.1. Use the gst-v4l2 plugin instead.
GStreamer version 1.0 includes the following gst-omx video decoders:
Video Decoder
Description
omxh265dec
OpenMAX IL H.265 Video decoder
omxh264dec
OpenMAX IL H.264 Video decoder
omxmpeg4videodec
OpenMAX IL MPEG4 Video decoder
omxmpeg2videodec
OpenMAX IL MPEG2 Video decoder
omxvp8dec
OpenMAX IL VP8 Video decoder
omxvp9dec
OpenMAX IL VP9 video decoder
GStreamer version 1.0 includes the following gst-v4l2 video decoders:
Video Decoder
Description
nvv4l2decoder
V4L2 H.265 video decoder
V4L2 H.264 video decoder
V4L2 VP8 video decoder
V4L2 VP9 video decoder
V4L2 MPEG4 video decoder
V4L2 MPEG2 video decoder
GStreamer version 1.0 includes the following gst-omx video encoders:
Video Encoders
Description
omxh264enc
OpenMAX IL H.264/AVC video encoder
omxh265enc
OpenMAX IL H.265/HEVC video encoder
omxvp8enc
OpenMAX IL VP8 video encoder (supported with NVIDIA® Jetson™ TX2/TX2i and NVIDIA® Jetson Nano™; not supported with NVIDIA® Jetson AGX Xavier™)
omxvp9enc
OpenMAX IL VP9 video encoder (supported with Jetson TX2 and Jetson AGX Xavier; not supported with Jetson Nano)
GStreamer version 1.0 includes the following gst-v4l2 video encoders:
Video Encoders
Description
nvv4l2h264enc
V4L2 H.264 video encoder
nvv4l2h265enc
V4L2 H.265 video encoder
nvv4l2vp8enc
V4L2 VP8 video encoder (supported with Jetson TX2/TX2i and Jetson Nano; not supported with Jetson AGX Xavier)
nvv4l2vp9enc
V4L2 VP9 video encoder (supported with Jetson TX2 and Jetson AGX Xavier; not supported with Jetson Nano)
GStreamer version 1.0 includes the following gst-omx video sink:
Video Sink
Description
nvoverlaysink
OpenMAX IL videosink element
GStreamer version 1.0 includes the following EGL image video sink:
Video Sink
Description
nveglglessink
EGL/GLES videosink element; supports both the X11 and Wayland backends
nv3dsink
Videosink element based on the 3D graphics rendering API
GStreamer version 1.0 includes the following DRM video sink:
Video Sink
Description
nvdrmvideosink
DRM videosink element
Note
The nvoverlaysink plugin is deprecated in Jetson Linux Driver Package Release 32.1. Use nvdrmvideosink and nv3dsink instead for render pipelines with gst-v4l2 decoder.
GStreamer version 1.0 includes the following proprietary NVIDIA plugins:
NVIDIA Proprietary Plugin
Description
nvarguscamerasrc
Camera plugin for ARGUS API
nvvidconv
Video format conversion & scaling
nvcompositor
Video compositor
nveglstreamsrc
Acts as a GStreamer source component; accepts EGLStream from an EGLStream producer
nvvideosink
Video sink component; accepts YUV-I420 format and produces EGLStream (RGBA)
nvegltransform
Video transform element for NVMM to EGLimage (supported with nveglglessink only)
GStreamer version 1.0 includes the following libjpeg-based JPEG image encode/decode plugins:
JPEG
Description
nvjpegenc
JPEG encoder element
nvjpegdec
JPEG decoder element
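As a quick illustration (a sketch only; the caps and output filename are arbitrary), a single test frame can be encoded with nvjpegenc:

gst-launch-1.0 videotestsrc num-buffers=1 ! 'video/x-raw, width=640, height=480, format=I420' ! nvjpegenc ! filesink location=test.jpg -e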
Note
Execute this command on the target before starting the video decode pipeline using gst-launch or nvgstplayer.
export DISPLAY=:0
Start the X server with xinit &, if it is not already running.
Decode Examples
The examples in this section show how you can perform audio and video decode with GStreamer.
Note
GStreamer version 0.10 support is deprecated in Jetson Linux Driver Package Release 24.2. Use of GStreamer version 1.0 is recommended for development.
Audio Decode Examples Using gst-launch-1.0
The following examples show how you can perform audio decode using GStreamer-1.0.
For decode use cases with low memory allocation requirements (e.g. on Jetson Nano), use the enable-low-outbuffer property of the gst-omx decoder plugin.
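For instance, a hedged sketch of an AAC audio decode pipeline; test.mp4 is a placeholder input file, and avdec_aac comes from the standard gst-libav plugin set rather than an NVIDIA element:

gst-launch-1.0 filesrc location=test.mp4 ! qtdemux name=demux demux.audio_0 ! queue ! avdec_aac ! audioconvert ! alsasink -e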
Supported H.264/H.265/VP8/VP9 Encoder Features with GStreamer-1.0
This section describes example gst-launch-1.0 usage for features supported by the NVIDIA accelerated H.264/H.265/VP8/VP9 encoders.
Features Supported Using gst-omx
This section describes example gst-launch-1.0 usage for features supported by the NVIDIA accelerated H.264/H.265/VP8/VP9 gst-omx encoders.
Note
Display detailed information on the omxh264enc, omxh265enc, omxvp8enc, or omxvp9enc encoder properties with the gst-inspect-1.0 [omxh264enc | omxh265enc | omxvp8enc | omxvp9enc] command.
1
FastPreset
Only Integer Pixel (integer-pel) block motion is estimated. For I/P macroblock mode decisions, only Intra 16×16 cost is compared with Inter mode costs. Supports Intra 16×16 and Intra 4×4 modes.
2
MediumPreset
Supports up to Half Pixel (half-pel) block motion estimation. For I/P macroblock mode decisions, only Intra 16×16 cost is compared with Inter mode costs. Supports Intra 16×16 and Intra 4×4 modes.
3
SlowPreset
Supports up to Quarter Pixel (Qpel) block motion estimation. For I/P macroblock mode decisions, Intra 4×4 as well as Intra 16×16 cost is compared with Inter mode costs. Supports Intra 16×16 and Intra 4×4 modes.
For omxh264enc, the following levels are supported: 1, 1b, 1.2, 1.3, 2, 2.1, 2.2, 3, 3.1, 3.2, 4, 4.1, 4.2, 5, 5.1, and 5.2.
For omxh265enc, the following levels are supported: main1, main2, main2.1, main3, main3.1, main4, main4.1, main5, high1, high2, high2.1, high3, high3.1, high4, high4.1, and high5.
Set Number of B Frames Between Two Reference Frames
If the decoder buffer size or the network bandwidth is limited, configuring the virtual buffer size can constrain the generated video stream to those limitations, according to the following formula:
The parameter bit-packetization=0 configures the network abstraction layer (NAL) packet as macroblock (MB)-based, and slice-header-spacing=200 configures each NAL packet as 200 MB at maximum.
Slice Header Spacing with Spacing in Terms of Number of Bits
The parameter bit-packetization=1 configures the network abstraction layer (NAL) packet as size-based, and slice-header-spacing=1024 configures each NAL packet as 1024 bytes at maximum.
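Combining the two packetization modes described above into a sketch (caps and filenames are illustrative; swap in bit-packetization=0 with slice-header-spacing expressed in macroblocks for the MB-based variant):

gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw, width=1280, height=720, format=I420' ! omxh264enc bit-packetization=1 slice-header-spacing=1024 ! h264parse ! qtmux ! filesink location=test.mp4 -e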
Features Supported Using gst-v4l2
This section describes example gst-launch-1.0 usage for features supported by the NVIDIA accelerated H.264/H.265/VP8/VP9 gst-v4l2 encoders.
Note
Display detailed information on the nvv4l2h264enc, nvv4l2h265enc, nvv4l2vp8enc, or nvv4l2vp9enc encoder properties with the gst-inspect-1.0 [nvv4l2h264enc | nvv4l2h265enc | nvv4l2vp8enc | nvv4l2vp9enc] command.
Set I-Frame Interval (Supported with H.264/H.265/VP9 Encode)
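A minimal sketch exercising the iframeinterval property (caps, the interval value, and the output filename are illustrative):

gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw, width=1280, height=720, format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nvv4l2h264enc iframeinterval=30 ! h264parse ! qtmux ! filesink location=test.mp4 -e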
2
FastPreset
Only Integer Pixel (integer-pel) block motion is estimated. For I/P macroblock mode decisions, only Intra 16×16 cost is compared with Inter mode costs. Supports Intra 16×16 and Intra 4×4 modes.
3
MediumPreset
Supports up to Half Pixel (half-pel) block motion estimation. For I/P macroblock mode decisions, only Intra 16×16 cost is compared with Inter mode costs. Supports Intra 16×16 and Intra 4×4 modes.
4
SlowPreset
Supports up to Quarter Pixel (Qpel) block motion estimation. For I/P macroblock mode decisions, Intra 4×4 as well as Intra 16×16 cost is compared with Inter mode costs. Supports Intra 16×16 and Intra 4×4 modes.
Two-pass CBR must be enabled along with constant bit rate (control-rate=1).
Note
For multi-instance encode with two-pass CBR enabled, enable max perf mode by using the maxperf-enable property of the gst-v4l2 encoder to achieve best performance. Expect increased power consumption in max perf mode.
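Under those constraints, a two-pass CBR encode might be sketched as follows (the bitrate value is illustrative; confirm the exact spelling of the EnableTwopassCBR and maxperf-enable properties with gst-inspect-1.0 on your release):

gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw, width=1280, height=720, format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nvv4l2h264enc control-rate=1 EnableTwopassCBR=1 bitrate=4000000 maxperf-enable=1 ! h264parse ! qtmux ! filesink location=test.mp4 -e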
Slice-Header-Spacing with Spacing in Terms of MB (Supported with H.264/H.265 Encode)
The parameter bit-packetization=0 configures the network abstraction layer (NAL) packet as macroblock (MB)-based, and slice-header-spacing=8 configures each NAL packet as 8 MB at maximum.
Slice Header Spacing with Spacing in Terms of Number of Bits (Supported with H.264/H.265 Encode)
The parameter bit-packetization=1 configures the network abstraction layer (NAL) packet as size-based, and slice-header-spacing=1400 configures each NAL packet as 1400 bytes at maximum.
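A sketch of the size-based variant for the gst-v4l2 encoder (caps and filenames are illustrative; use bit-packetization=0 with slice-header-spacing in macroblocks for the MB-based variant):

gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw, width=1280, height=720, format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nvv4l2h264enc bit-packetization=1 slice-header-spacing=1400 ! h264parse ! qtmux ! filesink location=test.mp4 -e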
Enable Cabac-Entropy-Coding (Supported with H.264 Encode for Main or High Profile)
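A hedged sketch; the profile enum value for High (shown here as 4) and the exact property spellings should be confirmed with gst-inspect-1.0 nvv4l2h264enc, and the filenames are placeholders:

gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw, width=1280, height=720, format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nvv4l2h264enc profile=4 cabac-entropy-coding=1 ! h264parse ! qtmux ! filesink location=test.mp4 -e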
This property sets the number of B frames between two reference frames.
Note
For multi-instance encode with num-B-Frames=2, enable max perf mode by specifying the maxperf-enable property of the gst-v4l2 encoder for best performance. Expect increased power consumption in max perf mode.
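A sketch setting two B frames between reference frames (caps and filenames are illustrative; confirm the num-B-Frames property spelling with gst-inspect-1.0):

gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw, width=1280, height=720, format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nvv4l2h264enc num-B-Frames=2 maxperf-enable=1 ! h264parse ! qtmux ! filesink location=test.mp4 -e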
The OpenCV sample application opencv_nvgstcam simulates the camera capture pipeline. Similarly, the OpenCV sample application opencv_nvgstenc simulates the video encode pipeline.
Both sample applications are based on GStreamer 1.0. They are currently supported only with OpenCV version 3.3.
opencv_nvgstcam: Camera Capture and Preview
To simulate the camera capture pipeline with the opencv_nvgstcam sample application, enter this command:
./opencv_nvgstcam --help
Note
The opencv_nvgstcam application as distributed currently supports only single-instance CSI capture using the nvarguscamerasrc plugin. You can modify and rebuild the application to support GStreamer pipelines for CSI multi-instance capture and USB camera capture using the v4l2src plugin. The application uses an OpenCV-based videosink for display.
For camera CSI capture and preview rendering with OpenCV, enter this command:
To simulate the camera capture and video encode pipeline with the opencv_nvgstenc sample application, enter this command:
./opencv_nvgstenc --help
Note
The opencv_nvgstenc application as distributed currently supports only camera CSI capture using the nvarguscamerasrc plugin and video encode in H.264 format using the nvv4l2h264enc plugin with an MP4 container file. You can modify and rebuild the application to support GStreamer pipelines for different video encoding formats. The application uses an OpenCV-based videosink for display.
For camera CSI capture and video encode with OpenCV enter this command:
For nvgstplayer-1.0 usage information enter the following command:
nvgstplayer-1.0 --help
Video can be output to HD displays using the HDMI connector on the platform. The GStreamer-1.0 application currently supports the following video sinks:
Note
The nvoverlaysink plugin is deprecated in Jetson Linux Driver Package Release 32.1. Use the nvdrmvideosink plugin for development.
Overlay Sink (Video playback on overlay in full-screen mode)
The following examples show how you can perform video playback using GStreamer-1.0.
Overlay Sink (Video playback using overlay parameters)
Note:
The following steps are required to use the “overlay” property on Jetson TX2.
Set win_mask with the following commands:
sudo -s
cd /sys/class/graphics/fb0
echo 4 > blank               # Blank the monitor while changing the display setting.
echo 0x0 > device/win_mask   # Clear the current window setting.
echo 0x3f > device/win_mask  # Assign all 6 overlay windows in the display controller to display 0 (fb0).
echo 0 > blank               # Unblank the display.
Stop X11 using the following commands:
$ sudo systemctl stop gdm
$ sudo loginctl terminate-seat seat0
For more information about the overlay windows in the display controller, see the Tegra X2 Technical Reference Manual (TRM).
To use all six overlays, X11 must be disabled, since it occupies one window. Disabling X11 also helps avoid memory bandwidth contention when using a non-X11 overlay.
nveglglessink (Windowed video playback, NVIDIA EGL/GLES videosink using Wayland backend)
You can also use nveglglessink with the Wayland backend instead of the default X11 backend.
Ubuntu 16.04 does not support the Wayland display server; that is, there is no UI support for switching from Xorg to Wayland. You must start the Wayland server (Weston) from the target’s shell before performing any Weston-based operation.
To start Weston:
The following steps are required before you first run the GStreamer pipeline with the Wayland back end. They are not required on subsequent runs.
nv3dsink Video Sink (Video playback using 3D graphics API)
This video sink element works with NVMM buffers and renders using the 3D graphics rendering API. It performs better than nveglglessink with NVMM buffers.
The following command starts the GStreamer pipeline using nv3dsink:
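A representative decode-and-render sketch (the input file is a placeholder and is assumed to contain an H.264 stream):

gst-launch-1.0 filesrc location=test.mp4 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e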
You can simulate a video decode pipeline using the GStreamer-1.0 based OpenCV sample application opencv_nvgstdec.
Note:
The sample application currently operates only with OpenCV version 3.3.
For opencv_nvgstdec usage information, enter the following command:
./opencv_nvgstdec --help
Note:
The opencv_nvgstdec application as distributed currently supports only video decode of the H.264 format using the nvv4l2decoder plugin. You can modify and rebuild the application to support GStreamer pipelines for video decode of different formats. For display, the application uses an OpenCV-based videosink component.
To perform video decoding with opencv_nvgstdec, enter the following command:
./opencv_nvgstdec --file-path=test_file_h264.mp4
Video Streaming with GStreamer-1.0
This section describes procedures for video streaming with GStreamer‑1.0.
The NVIDIA proprietary nvvidconv GStreamer-1.0 plugin allows conversion between OSS (raw) video formats and NVIDIA video formats. The nvvidconv plugin currently supports the format conversions described in this section.
raw-yuv Input Formats
Currently nvvidconv supports the I420, UYVY, YUY2, YVYU, NV12, GRAY8, BGRx, and RGBA raw-yuv input formats.
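For instance, a hedged conversion sketch from raw I420 into the NVIDIA NV12 surface format (fakesink stands in for a real downstream element):

gst-launch-1.0 videotestsrc num-buffers=60 ! 'video/x-raw, width=1280, height=720, format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! fakesink -e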
The NVIDIA proprietary nvvidconv GStreamer-1.0 plugin also allows you to perform video scaling. The nvvidconv plugin currently supports scaling with the format conversions described in this section.
raw-yuv Input Formats
Currently nvvidconv supports the I420, UYVY, YUY2, YVYU, NV12, GRAY8, BGRx, and RGBA raw-yuv input formats for scaling.
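A hedged scaling sketch, downscaling a 1080p raw source to 640x480 (fakesink stands in for a real downstream element):

gst-launch-1.0 videotestsrc num-buffers=60 ! 'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), width=640, height=480, format=NV12' ! fakesink -e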
This NVIDIA proprietary GStreamer-1.0 plugin performs pre/post and CUDA post-processing operations on CSI camera-captured or decoded frames, and renders the video using an overlay video sink or sends it to video encode.
Note
The gst-nvivafilter pipeline requires unsetting the DISPLAY environment variable using the command unset DISPLAY if lightdm is stopped.
See the nvsample_cudaprocess_src.tbz2 package for the libnvsample_cudaprocess.so library sources. A sample CUDA implementation of libnvsample_cudaprocess.so can be replaced by a custom CUDA implementation.
Video Rotation with GStreamer-1.0
The NVIDIA proprietary nvvidconv GStreamer-1.0 plugin also allows you to perform video rotation operations.
The following table shows the supported values for the nvvidconv flip-method property.
Flip Method
Property value
Identity - no rotation (default)
0
Counterclockwise - 90 degrees
1
Rotate - 180 degrees
2
Clockwise - 90 degrees
3
Horizontal flip
4
Upper-right diagonal flip
5
Vertical flip
6
Upper-left diagonal flip
7
Note
Get information on the nvvidconv flip-method property with the gst-inspect-1.0 nvvidconv command.
To rotate video 90 degrees counterclockwise
To rotate video 90 degrees in a counterclockwise direction, enter the following command.
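A hedged sketch of such a rotation on a decoded stream (the input file is a placeholder; flip-method=1 selects counterclockwise 90 degrees per the table above):

gst-launch-1.0 filesrc location=test.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv flip-method=1 ! 'video/x-raw(memory:NVMM), format=NV12' ! nv3dsink -e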
This release contains the gst-install script to install a specific GStreamer version. This section provides a procedure for building current versions of GStreamer.
This section describes the nvgstcapture-1.0 application.
Note:
By default, the nvgstcapture-1.0 application only supports the ARGUS API using the nvarguscamerasrc plugin. The legacy nvcamerasrc plugin is no longer supported.
Command Line Options
The nvgstcapture-1.0 application can display information about its own command line options. To display command line option information, run the application with one of these command line options:
“Help” Command Line Options
Option
Description
-h
--help
Show help options except for GStreamer options.
--help-all
Show all help options.
--help-gst
Show GStreamer options.
This table describes the application’s other command-line options.
Other Command Line Options
Option
Description
Notes / Examples
--prev_res
Preview width and height.
Range: 2 to 8 (3840x2160)
--prev_res=3
--cus-prev-res
Custom preview width and height for CSI only.
--cus-prev-res=1920x1080
--image_res
Image width and height.
Range: 2 to 12 (5632x4224)
--image_res=3
--video-res
Video width and height.
Range: 2 to 9 (3896x2192)
--video-res=3
--camsrc
Camera source to use
0=V4L2 1=csi[default] 2=videotest 3=eglstream
-m, --mode
Capture mode.
1-Still 2-Video
-v, --video_enc
Video encoder type.
0=h264[HW] 1=vp8[HW, not supported on Jetson AGX Xavier] 2=h265[HW] 3=vp9[HW]
-p, --hw-enc-path
Framework Type.
0=OMX
1=V4L2
-b, --enc-bitrate
Video encoding bit-rate (in bytes).
--enc-bitrate=4000000
--enc-controlrate
Video encoding bit-rate control method.
0 = Disable
1 = variable (Default)
2 = constant
--enc-controlrate=1
--enc-EnableTwopassCBR
Enable two pass CBR while encoding.
0 = Disable
1 = Enable
--enc-EnableTwopassCBR=1
--enc-profile
Video encoder profile (only for H.264).
0-Baseline 1-Main 2-High
-j, --image_enc
Image encoder type.
0-jpeg_SW[jpegenc] 1-jpeg_HW[nvjpegenc]
-k, --file_type
Container file type.
0-MP4 1-3GP 2-MKV
--file-name
Captured file name. nvcamtest is used by default.
--color-format
Color format to use.
0=I420 1=NV12[For CSI only and default for CSI] 2=YUY2[For V4L2 only, default for V4L2]
--orientation
Camera sensor orientation value.
--eglConfig
EGL window Coordinates (x_pos y_pos) in that order.
--eglConfig="50 100"
-w, --whitebalance
Capture whitebalance value.
--timeout
Capture timeout value.
--saturation
Camera saturation value.
--sensor-id
Camera Sensor ID value.
--display-id
[For nvoverlaysink only] Display ID value.
--overlayConfig
Overlay Configuration Options index and coordinates in (index, x_pos, y_pos, width, height) order.
--overlayConfig="0, 0, 0, 1280, 720"
--cap-dev-node
Video capture device node.
0=/dev/video0[default] 1=/dev/video1
2=/dev/video2
--cap-dev-node=0
--svs=<chain>
Where <chain> is a chain of GStreamer elements:
For USB, specifies a chain for video preview.
For CSI only, use nvoverlaysink or nvdrmvideosink.
--exposuretimerange
Property to adjust exposure time range in nanoseconds.
--exposuretimerange="34000 358733000"
--gainrange
Property to adjust gain range
--gainrange="1 16"
--ispdigitalgainrange
Property to adjust digital gain range. Range value is from 1 to 256.
--ispdigitalgainrange="1 8"
--aelock
Enable AE Lock.
Default is disabled.
--awblock
Enable AWB Lock.
Default is disabled.
--exposurecompensation
Property to adjust exposure compensation.
Range value from -2.0 to 2.0.
--exposurecompensation=0.5
--aeantibanding
Property to set the auto exposure antibanding mode.
Range value from 0 to 3.
--aeantibanding=2
--tnr-mode
Property to select temporal noise reduction mode.
--tnr-mode=2
--tnr-strength
Property to adjust temporal noise reduction strength.
--tnr-strength=0.5
--ee-mode
Property to select edge enhancement mode.
--ee-mode=2
--ee-strength
Property to adjust edge enhancement strength.
--ee-strength=0.5
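Putting a few of the options above together, a hedged invocation (values are illustrative) that records H.264 video from the default CSI sensor might look like:

nvgstcapture-1.0 --mode=2 --video-res=3 --enc-bitrate=4000000 --file-name=clip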
CSI Camera Supported Resolutions
CSI camera supports the following image resolutions for Nvarguscamera:
• 640x480
• 1280x720
• 1920x1080
• 2104x1560
• 2592x1944
• 2616x1472
• 3840x2160
• 3896x2192
• 4208x3120
• 5632x3168
• 5632x4224
CSI Camera Runtime Commands
Options for Nvarguscamera
CSI camera runtime command options for Nvarguscamera are described in the following table.
Command
Description
Notes
jx<delay>
Capture after a delay of <delay> milliseconds, e.g., jx5000 to capture after a 5-second delay.
—
j:<value>
Capture <value> number of images in succession, e.g., j:6 to capture 6 images.
—
0
Stop recording video
—
1
Start recording video
—
2
Video snapshot (while recording video)
—
gpcr
Get preview resolution
—
gicr
Get image capture resolution
—
gvcr
Get video capture resolution
—
USB Camera Runtime Commands
USB camera runtime commands are described in the following table.
Command
Description
Notes
h
Help
—
q
Quit
—
mo:<value>
Set capture mode
1-image 2-video
gmo
Get capture mode
—
j
Capture one image.
—
jx<delay>
Capture after a delay of <delay> milliseconds, e.g., jx5000 to capture after a 5-second delay.
—
j:<value>
Capture <value> number of images in succession, e.g., j:6 to capture 6 images.
—
1
Start recording video
—
0
Stop recording video
—
pcr:<value>
Set preview resolution
0-176x144 1-320x240 2-640x480 3-1280x720
gpcr
Get preview resolution
—
gicr
Get image capture resolution
—
gvcr
Get video capture resolution
—
br:<value>
Set encoding bit rate (in bytes)
e.g., br:4000000
gbr
Get encoding bit rate
—
cdn:<value>
Set capture device node
0-/dev/video0 1-/dev/video1 2-/dev/video2
gcdn
Get capture device node
—
Runtime Video Encoder Configuration Options
The following table describes runtime video encoder configuration options supported for Nvarguscamera.
Command
Description
Notes
br:<val>
Sets encoding bit-rate (in bytes)
Example: br:4000000
gbr
Gets encoding bit-rate (in bytes)
—
ep:<val>
Sets encoding profile (for H.264 only)
Example: ep:1
(0): Baseline
(1): Main
(2): High
gep
Gets encoding profile (for H.264 only)
—
f
Forces IDR frame on video encoder (for H.264 only)
—
Notes
• The nvgstcapture-1.0 application generates image and video output files in the same directory as the application itself.
• Filenames for image and video content are in the following formats, respectively:
• nvcamtest_<pid>_<sensor_id>_<counter>.jpg
• nvcamtest_<pid>_<sensor_id>_<counter>.mp4
Where:
• <pid> is the process ID
• <sensor_id> is the sensor ID
• <counter> is a counter starting from 0 every time you run the application
Rename or move files between runs to avoid overwriting results you want to save.
• The nvgstcapture-1.0 application supports native capture (video only) mode by default.
• Advanced features, like setting zoom, brightness, exposure, and whitebalance levels, are not supported for USB cameras.
nvgstplayer-1.0 Reference
This section describes the operation of the nvgstplayer-1.0 application.
nvgstplayer Command Line Options
Note:
To list supported options, enter the command:
nvgstplayer-1.0 --help
This table describes nvgstplayer-1.0 command-line options:
Option
Description and Example
-u <path>
--urifile <path>
Path of the file containing the URIs.
Example: -u my_uri.txt
-I <uri>
--uri <uri>
Input URI.
Examples:
-I file:///home/ubuntu/movie.avi
--uri http://www.joedoe.com/foo.ogg
-e <path>
--elemfile <path>
Element(s) (Properties) file.
The element file may contain an audio or video processing elements chain like this:
[sas]
pipe=alsasink # device=demixer
-x
--cxpr
Command sequence expression.
Example: --cxpr="r5 s0"
-n <n>
--loop <n>
Number of times to play the media.
-c <n>
--audio-track <n>
If a stream has multiple audio tracks, specifies the track number to play.
-v <n>
--video-track <n>
If a stream has multiple video tracks, specifies the track number to play.
-a <seconds>
--start <seconds>
Point to start playback, in seconds from the beginning of the media segment.
-d <seconds>
--duration <seconds>
Duration of playback, in seconds.
--no-sync
Disable AV sync.
--disable-dpms
Unconditionally disable DPMS/ScreenBlanking during operation; re-enable on exit.
--stealth
Operate in stealth mode, staying alive even when no media is playing.
--bg
Operate in background mode, ignoring keyboard input.
--use-playbin
Use Playbin GStreamer element.
--no-audio
Disable audio.
--no-video
Disable video.
--disable-anative
Disable native audio rendering.
--disable-vnative
Disable native video rendering.
--use-buffering
Enable the decodebin property that emits GST_MESSAGE_BUFFERING based on low and high percent thresholds.
-l <percent>
--low-percent <percent>
Low threshold for buffering to start, in percent.
-j <percent>
--high-percent <percent>
High threshold for buffering to finish, in percent.
--loop-forever
Play the URI(s) in an endless loop.
-t <seconds>
--max-size-time <seconds>
Maximum time in queue, in seconds (0=automatic).
-y <n>
--max-size-bytes <n>
Maximum amount of memory in the queue, in bytes (0=automatic).
-b <n>
--max-size-buffers <n>
Maximum number of buffers in the queue (0=automatic).
--window-x <n>
X coordinate for player window (for non-overlay rendering).
--window-y <n>
Y coordinate for player window (for non-overlay rendering).
--window-width <n>
Window width (for non-overlay rendering).
--window-height <n>
Window height (for non-overlay rendering).
--disable-fullscreen
Play video in non-full-screen mode (for nveglglessink).
-k <seconds>
--image-display-time <seconds>
Image display time, in seconds.
--show-tags
Shows tags (metadata), if available.
--stats
Shows stream statistics, if enabled.
--stats-file
File to dump stream statistics, if enabled.
--svd=<chain>
Where <chain> is a chain of GStreamer elements
Chain to use for video decoding.
--sad=<chain>
Chain to use for audio decoding.
--svc=<chain>
Chain to use for video postprocessing.
--sac=<chain>
Chain to use for audio postprocessing.
--svs=<chain>
Chain to use for video rendering.
--sas=<chain>
Chain to use for audio rendering.
--shttp=<chain>
Chain to use for http source.
--srtsp=<chain>
Chain to use for rtsp source.
--sudp=<chain>
Chain to use for udp source.
--sfsrc=<chain>
Chain to use for file source.
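As a combined illustration of the options above (the URI is a placeholder), the following hedged invocation plays a local file twice with statistics enabled:

nvgstplayer-1.0 -I file:///home/ubuntu/movie.avi --loop=2 --stats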
nvgstplayer Runtime Commands
This table describes nvgstplayer runtime commands.
Command
Description Example
h
Help
q
Quit
Up Arrow
]
Go to next track.
c
Restart current track.
Down Arrow
[
Go to previous track.
spos
Query for position (time from start).
sdur
Query for duration.
s<n>
Seek to <n> seconds from start.
e.g. s5.120
v<pct>
Seek to <pct> percent of the duration.
e.g. v54
f<val>
Shift <val> seconds relative to the current position.
e.g., f23.901
Left Arrow
<
Seek backward 10 seconds.
Right Arrow
>
Seek forward 10 seconds.
p
Pause playback.
r
Start/resume playback.
z
Stop playback.
i:<uri>
Enter a single URI.
Video Encoder Features
The GStreamer-1.0-based gst-omx video encoders support the following features:
Video Encoder Feature            | H264enc | H265enc  | Vp8enc | Vp9enc
profile (Baseline / Main / High) | ✓ (all) | ✓ (Main) | ✓      | ✓
level                            | ✓       | ✓        | —      | —
bitrate                          | ✓       | ✓        | ✓      | ✓
peak bitrate                     | ✓       | ✓        | —      | —
stringent bitrate                | ✓       | ✓        | —      | —
insert-spsppsatidr               | ✓       | ✓        | ✓      | ✓
control-rate                     | ✓       | ✓        | ✓      | ✓
iframeinterval                   | ✓       | ✓        | ✓      | ✓
qp-range                         | ✓       | ✓        | ✓      | ✓
temporal-tradeoff                | ✓       | ✓        | ✓      | ✓
bit-packetization                | ✓       | ✓        | ✓      | ✓
preset-level                     | ✓       | ✓        | ✓      | ✓
low-latency                      | ✓       | ✓        | ✓      | ✓
slice-header spacing             | ✓       | ✓        | —      | —
force-IDR                        | ✓       | ✓        | ✓      | ✓
vbv-size                         | ✓       | ✓        | ✓      | ✓
sliceintrarefreshenable          | ✓       | ✓        | —      | —
sliceintrarefreshinterval        | ✓       | ✓        | —      | —
EnableTwoPassCBR                 | ✓       | ✓        | ✓      | ✓
num-B-Frames                     | ✓       | —        | —      | —
The GStreamer-1.0-based gst-v4l2 video encoders support the following features:
Video Encoder Feature                   | H264enc | H265enc  | Vp8enc | Vp9enc
profile (Baseline / Main / High)        | ✓ (all) | ✓ (Main) | ✓      | ✓
control-rate                            | ✓       | ✓        | ✓      | ✓
bitrate                                 | ✓       | ✓        | ✓      | ✓
insert-spsppsatidr                      | ✓       | —        | —      | —
profile                                 | ✓       | —        | —      | —
quantization range for I, P and B frame | ✓       | ✓        | —      | —
iframeinterval                          | ✓       | ✓        | ✓      | ✓
qp-range                                | ✓       | ✓        | —      | —
bit-packetization                       | ✓       | ✓        | —      | —
preset-level                            | ✓       | ✓        | ✓      | ✓
slice-header spacing                    | ✓       | ✓        | —      | —
force-IDR                               | ✓       | ✓        | ✓      | ✓
EnableTwoPassCBR                        | ✓       | ✓        | —      | —
Enable cabac-entropy-coding             | ✓       | —        | —      | —
Enable MVBufferMeta                     | ✓       | ✓        | —      | —
Insert aud                              | ✓       | ✓        | —      | —
Insert vui                              | ✓       | ✓        | —      | —
num-B-Frames                            | ✓       | —        | —      | —
Supported Cameras
This section describes the supported cameras.
CSI Cameras
NVIDIA® Jetson Nano™, NVIDIA® Jetson AGX Xavier™, and NVIDIA® Jetson™ TX2 can capture camera images via CSI interface.
Jetson Nano, Jetson AGX Xavier, and Jetson TX2 all support both YUV and RAW Bayer capture data.
GStreamer supports simultaneous capture from multiple CSI cameras. Support is validated using the nvgstcapture application.
Capture is validated for SDR, PWL HDR and DOL HDR modes for various sensors using the nvgstcapture application.
• Jetson AGX Xavier and Jetson TX2 also support the MIPI CSI virtual channel feature. The virtual channel is a unique channel identifier used for multiplexed sensor streams sharing the same CSI port/brick and CSI stream through supported GMSL (Gigabit Multimedia Serial Link) aggregators.
• GMSL + VC capture is validated on Jetson AGX Xavier and Jetson TX2 using the nvgstcapture application. The reference GMSL module (MAX9295-serializer/MAX9296-deserializer/IMX390-sensor) is used for validation purposes.
USB 2.0 Cameras
The following camera has been validated on Jetson platforms running Jetson Linux Driver Package with USB 2.0 ports. This camera is UVC compliant.