.. _SD.Multimedia.AcceleratedGstreamer: .. include:: /content/swdocs.rsts .. spelling:: actmon bitrate br camerasrc cdn dec DivX dsink EGLimage ep gbr gcdn gep gicr gmo gpcr gso gst gstreamer gvcr gwb intermode Intra jx Kp macroblock macroblocks maxperf Nano nv Nvarguscamera nvarguscamerasrc nvcompositor nvdrmvideosink nveglglessink nveglstreamsrc nvegltransform nvgstcam nvgstcapture nvgstenc nvgstplayer nvivafilter nvjpegdec nvjpegenc nvoverlaylink nvoverlaysink nvoverlysink nvv nvvidconv nvvideosink omx omxh omxmpeg omxvp opencv pcr perf poc pre src th Theora Transcode transcoding unsetting vbv videocuda videodec videosink vp wb whitebalance Xorg xvimagesink YUV Accelerated GStreamer !!!!!!!!!!!!!!!!!!!!! This topic is a guide to the GStreamer version 1.0 and 1.14 based accelerated solution included in |NVIDIA(r)| |Jetson(tm)| Linux. .. todo:: "Solution" makes it sound like some kind of application that uses GStreamer. Wouldn't it be more natural to call it "a hardware-accelerated version of GStreamer"? As a related issue, is it necessary to say "version 1.0" over and over? It seems simpler just to say "GStreamer." We've already stated which versions Jetson Linux supports (although the situation seems to be much more complex than claimed here, see later comments). As a further related issue, the term "GStreamer-1.0" is used repeatedly, but is never explained. It looks like the name of a software element, although using upper case letters in the name of a software element would be unusual. Perhaps it's just another way of writing "GStreamer version 1.0." .. note:: References to GStreamer version 1.0 also apply to GStreamer version 1.16. GStreamer-1.0 Installation and Setup @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ This section explains how to install and configure GStreamer. .. todo:: There's another section on installing ("building") GStreamer much later, at the start of the GStreamer part of the topic. I plan to consolidate this section with that one. I need some information to co-ordinate them. The procedures described here (install with apt) and there (build with gst-install, or build manually) are completely different, and the reader is given no direction for choosing one. The information about versions is also very spotty: above we say that Jetson Linux runs with GStreamer 1.0 or 1.14, but this section doesn't explain how to install 1.14 (assuming it installs 1.0 as given, which isn't clearly stated). The gst-install procedure doesn't say what versions it works with, but has an example with v1.16.2. The manual procedure instructions appear to say that it works only with the latest version, which it calls 1.16.2, but in fact the latest stable version is 1.18.4. To install GStreamer-1.0 ######################## - Enter the commands:: $ sudo apt-get update $ sudo apt-get install gstreamer1.0-tools gstreamer1.0-alsa \ gstreamer1.0-plugins-base gstreamer1.0-plugins-good \ gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \ gstreamer1.0-libav $ sudo apt-get install libgstreamer1.0-dev \ libgstreamer-plugins-base1.0-dev \ libgstreamer-plugins-good1.0-dev \ libgstreamer-plugins-bad1.0-dev To check the GStreamer-1.0 version ################################## - Enter the command:: $ gst-inspect-1.0 --version GStreamer-1.0 Plugin Reference ############################## .. note:: The ``gst-omx`` plugin is no longer supported in |NVIDIA(r)| Tegra\ |reg| Linux Driver Package (now Jetson Linux) release 34.1. Use the ``gst-v4l2`` plugin for development. .. 
todo:: I don't think the reader needs to know that ``gst-omx`` was deprecated in r32.1, which will be eight releases old by the time r34.1 is published. I understand that we normally remove a feature in the next minor release after it is deprecated. Has that happened yet? If so, we should say it is no longer supported, or just not mention it at all. As a related issue, it seems inappropriate to give extensive information about the use of gst-omx when it has long been deprecated if not removed. The whole point of deprecating a feature is to inform the reader that it is not to be used for development. Providing extensive information about how to use it will cause confusion: faced with huge amounts of information about gst-omx throughout the topic, the reader is likely to forget that it is deprecated, or not notice in the first place. Readers who need information about it to maintain existing code can refer to an old version of the document. >> @Jonathan, we agree to your point. Doing the cleanup. GStreamer version 1.0 includes the following ``gst-v4l2`` video decoders: +---------------+--------------------------------------------+ | Video decoder | Description | +===============+============================================+ | nvv4l2decoder | V4L2 H.265 Video decoder | | +--------------------------------------------+ | | V4L2 H.264 Video decoder | | +--------------------------------------------+ | | V4L2 VP8 video decoder | | +--------------------------------------------+ | | V4L2 VP9 video decoder | | +--------------------------------------------+ | | V4L2 MPEG4 video decoder | | +--------------------------------------------+ | | V4L2 MPEG2 video decoder | | +--------------------------------------------+ | | V4L2 AV1 video decoder (supported on | | | |NVIDIA(r)| Jetson AGX Orin only) | +---------------+--------------------------------------------+ GStreamer version 1.0 includes the following ``gst-v4l2`` video encoders: +----------------+----------------------------------------------------+ | Video encoder | Description | +================+====================================================+ | nvv4l2h264enc | V4L2 H.264 video encoder | +----------------+----------------------------------------------------+ | nvv4l2h265enc | V4L2 H.265 video encoder | +----------------+----------------------------------------------------+ | nvv4l2vp9enc | V4L2 VP9 video encoder (supported with |NVIDIA(r)| | | | |Jetson Xavier(tm) NX| series and Jetson AGX Xavier| | | series only) | | | | | | .. todo:: | | | Are they supported on Orin? If so, the | | | limitation is no longer needed. | | | >> not supported. Above limitation is correct | | | | +----------------+----------------------------------------------------+ | nvv4l2av1enc | V4L2 AV1 video encoder (supported with |NVIDIA(r)| | | | Jetson AGX Orin only) | +----------------+----------------------------------------------------+ GStreamer version 1.0 includes the following EGL\ |tm| image video sink: .. todo:: This topic and others refer often to "EGLimage." Which form is correct or preferred? +---------------+-----------------------------------------------------+ | Video sink | Description | +===============+=====================================================+ | nveglglessink | EGL/GLES video sink element, support both the X11 | | | and Wayland backends | | | | | | .. todo:: | | | Something is missing before "both." should we | | | say "used in" both the X11 and Wayland backends? 
| +---------------+-----------------------------------------------------+ | nv3dsink | EGL/GLES video sink element | +---------------+-----------------------------------------------------+ .. todo:: In the preceding table and others, the heading refers to "Video sink" and the body to "videosink." Which is preferred? >> video sink GStreamer version 1.0 includes the following DRM video sink: ============== ====================== Video sink Description ============== ====================== nvdrmvideosink DRM video sink element ============== ====================== .. todo:: The comments about gst-omx apply also to nvoverlaysink, since they were deprecated at the same time. Later several examples use nvoverlaysink, and need to be updated or deleted. I note that nvoverlaylink was introduced three tables above, then we passed onward to the EGL image sink and the DRM image sink; only then do we say, "Oh, by the way, nvoverlysink was deprecated eight releases ago, so don't use it!" Even if there is still reason to mention it, we should organize the information better than this. >> Done Jonathan GStreamer version 1.0 includes the following proprietary NVIDIA plugins: +---------------------------+-----------------------------------------+ | NVIDIA proprietary plugin | Description | +===========================+=========================================+ | nvarguscamerasrc | Camera plugin for ARGUS API | | | | | | .. todo:: | | | ARGUS is used in all caps in several | | | places, something I have not seen | | | anywhere else. Is there a reason why | | | it's used that way here? | | | | +---------------------------+-----------------------------------------+ | nvv4l2camerasrc | Camera plugin for V4L2 API | +---------------------------+-----------------------------------------+ | nvvidconv | Video format conversion and scaling | +---------------------------+-----------------------------------------+ | nvcompositor | Video compositor | +---------------------------+-----------------------------------------+ | nveglstreamsrc | Acts as GStreamer Source Component, | | | accepts EGLStream from EGLStream | | | producer | +---------------------------+-----------------------------------------+ | nvvideosink | Video Sink Component. Accepts YUV-I420 | | | format and produces EGLStream (RGBA) | +---------------------------+-----------------------------------------+ | nvegltransform | Video transform element for NVMM to | | | EGLimage (supported with nveglglessink | | | only) | +---------------------------+-----------------------------------------+ GStreamer version 1.0 includes the following ``libjpeg``\ -based JPEG image video encode/decode plugins: ========= ==================== JPEG Description ========= ==================== nvjpegenc JPEG encoder element nvjpegdec JPEG decoder element ========= ==================== .. note:: Enter this command before starting the video decode pipeline using ``gst-launch`` or ``nvgstplayer``:: $ export DISPLAY=:0 Enter this command to start X server if it is not already running:: $ xinit & Decode Examples @@@@@@@@@@@@@@@ The examples in this section show how you can perform audio and video decode with GStreamer. .. todo:: Release 24.2 is *24 releases old*. Is it really necessary to caution the reader against using components that far out of date? (If it is, this is the wrong place to do it.) Audio Decode Examples Using gst-launch-1.0 ########################################## The following examples show how you can perform audio decode using GStreamer-1.0. 
- AAC Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux name=demux demux.audio_0 ! \
         queue ! avdec_aac ! audioconvert ! alsasink -e

- AMR-WB Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux name=demux demux.audio_0 ! queue ! avdec_amrwb ! \
         audioconvert ! alsasink -e

- AMR-NB Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux name=demux demux.audio_0 ! queue ! avdec_amrnb ! \
         audioconvert ! alsasink -e

- MP3 Decode (OSS Software Decode)::

     $ gst-launch-1.0 filesrc location= ! mpegaudioparse ! \
         avdec_mp3 ! audioconvert ! alsasink -e

.. note::

   To route audio over |HDMI(r)|, set the ``alsasink`` property ``device``
   to the value given for your platform in the table
   :ref:`Port to device ID map ` in the topic
   :ref:`Audio Setup and Development `. For example, use ``device=hw:0,7``
   to route audio over the Jetson TX2 HDMI/DP 1 (HDMI) port.
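As an illustration of the preceding note, the following sketch routes the MP3 pipeline's output over HDMI. The device ID ``hw:0,3`` is only an example; substitute the value listed for your platform in the Port to device ID map::

     $ gst-launch-1.0 filesrc location= ! mpegaudioparse ! \
         avdec_mp3 ! audioconvert ! alsasink device=hw:0,3 -e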
Video Decode Examples Using gst-launch-1.0
##########################################

The following examples show how you can perform video decode with GStreamer-1.0.

Video Decode Using gst-v4l2
$$$$$$$$$$$$$$$$$$$$$$$$$$$

The following examples show how you can perform video decode using the
``gst-v4l2`` plugin with GStreamer-1.0.

- H.264 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e

  .. note::

     To enable max perf mode, use the ``enable-max-performance`` property of
     the ``gst-v4l2`` decoder plugin. Expect increased power consumption in
     max perf mode. For example::

        $ gst-launch-1.0 filesrc location= ! \
            qtdemux ! queue ! h264parse ! nvv4l2decoder \
            enable-max-performance=1 ! nv3dsink -e

  .. note::

     To decode H.264/H.265 GDR streams, you must enable error reporting by
     setting the property ``enable-frame-type-reporting`` to ``true``. For
     example::

        $ gst-launch-1.0 filesrc \
            location= ! \
            qtdemux ! queue ! h264parse ! nvv4l2decoder \
            enable-frame-type-reporting=1 ! nv3dsink -e

- H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux ! queue ! h265parse ! nvv4l2decoder ! nv3dsink -e

- 10-bit H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         matroskademux ! queue ! h265parse ! nvv4l2decoder ! \
         nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)NV12' ! \
         nv3dsink -e

- 12-bit H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         matroskademux ! queue ! h265parse ! nvv4l2decoder ! \
         nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)NV12' ! \
         nv3dsink -e

- 8-bit YUV444 (NV24) H.265 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         h265parse ! nvv4l2decoder ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)NV12' ! \
         nv3dsink -e

- VP9 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         matroskademux ! queue ! nvv4l2decoder ! nv3dsink -e

- VP8 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         matroskademux ! queue ! nvv4l2decoder ! nv3dsink -e

- MPEG-4 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux ! queue ! mpeg4videoparse ! nvv4l2decoder ! nv3dsink -e

- MPEG-4 Decode DivX 4/5 (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         avidemux ! queue ! mpeg4videoparse ! nvv4l2decoder ! nv3dsink -e

- MPEG-2 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         tsdemux ! queue ! mpegvideoparse ! nvv4l2decoder ! nv3dsink -e

- AV1 Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! \
         matroskademux ! queue ! nvv4l2decoder ! nv3dsink -e
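The preceding examples read from container files. If the source is instead a raw H.264 elementary stream, the demuxer can be dropped; a minimal sketch, assuming ``test.h264`` is such a stream::

     $ gst-launch-1.0 filesrc location=test.h264 ! h264parse ! \
         nvv4l2decoder ! nv3dsink -e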
Image Decode Examples Using gst-launch-1.0
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$

The following example shows how you can perform JPEG decode with GStreamer-1.0.

- JPEG Decode (NVIDIA Accelerated Decode)::

     $ gst-launch-1.0 filesrc location= ! nvjpegdec ! \
         imagefreeze ! xvimagesink -e

Encode Examples
@@@@@@@@@@@@@@@

The examples in this section show how you can perform audio and video encode
with GStreamer.

Audio Encode Examples Using gst-launch-1.0
##########################################

The following examples show how you can perform audio encode with GStreamer-1.0.

- AAC Encode (OSS Software Encode)::

     $ gst-launch-1.0 audiotestsrc ! \
         'audio/x-raw, format=(string)S16LE, layout=(string)interleaved, rate=(int)44100, channels=(int)2' ! \
         voaacenc ! qtmux ! filesink location=test.mp4 -e

- AMR-WB Encode (OSS Software Encode)::

     $ gst-launch-1.0 audiotestsrc ! \
         'audio/x-raw, format=(string)S16LE, layout=(string)interleaved, \
         rate=(int)16000, channels=(int)1' ! voamrwbenc ! qtmux ! \
         filesink location=test.mp4 -e

Video Encode Examples Using gst-launch-1.0
##########################################

The following examples show how you can perform video encode with GStreamer-1.0.

Video Encode Using gst-v4l2
$$$$$$$$$$$$$$$$$$$$$$$$$$$

The following examples show how you can perform video encode using the
``gst-v4l2`` plugin with GStreamer-1.0.

- H.264 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
         'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
         format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h264enc \
         bitrate=8000000 ! h264parse ! qtmux ! filesink \
         location= -e

  .. note::

     To enable max perf mode, use the ``maxperf-enable`` property of the
     ``gst-v4l2`` encoder plugin. Expect increased power consumption in max
     perf mode. For example::

        $ gst-launch-1.0 nvarguscamerasrc ! \
            'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
            format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h264enc \
            maxperf-enable=1 bitrate=8000000 ! h264parse ! qtmux ! filesink \
            location= -e

- 8-bit YUV444 (NV24) H.264 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 filesrc location= ! \
         videoparse width=352 height=288 format=52 framerate=30 ! \
         'video/x-raw, format=(string)NV24' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)NV24' ! nvv4l2h264enc \
         profile=High444 ! h264parse ! filesink \
         location= -e

  .. note::

     8-bit YUV444 H.264 encode is supported with the High444 profile.

- H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
         'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
         format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h265enc \
         bitrate=8000000 ! h265parse ! qtmux ! filesink \
         location= -e

  .. note::

     Jetson AGX Xavier and Jetson AGX Orin can support 8Kp30 H.265 encode.
     For example::

        $ gst-launch-1.0 nvarguscamerasrc ! \
            'video/x-raw(memory:NVMM), width=(int)3840, \
            height=(int)2160, format=(string)NV12, \
            framerate=(fraction)30/1' ! nvvidconv ! \
            'video/x-raw(memory:NVMM), width=(int)7680, \
            height=(int)4320, format=(string)NV12' ! nvv4l2h265enc \
            preset-level=1 control-rate=1 bitrate=40000000 ! \
            h265parse ! matroskamux ! \
            filesink location= -e

- 10-bit H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
         'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
         format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)P010_10LE' ! \
         nvv4l2h265enc bitrate=8000000 ! h265parse ! qtmux ! \
         filesink location= -e

- 8-bit YUV444 (NV24) H.265 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 filesrc location= ! \
         videoparse width=352 height=288 format=52 framerate=30 ! \
         'video/x-raw, format=(string)NV24' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)NV24' ! nvv4l2h265enc \
         profile=Main ! h265parse ! filesink location= -e

  .. note::

     8-bit YUV444 H.265 encode is supported with the Main profile.

- VP9 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
         'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
         format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2vp9enc \
         bitrate=8000000 ! matroskamux ! filesink \
         location= -e

  .. note::

     Jetson AGX Orin does not support VP9 encode using gst-v4l2.

- VP9 Encode with IVF Headers (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
         'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
         format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2vp9enc \
         enable-headers=1 bitrate=8000000 ! filesink \
         location= -e

- AV1 Encode (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
         'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
         format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2av1enc \
         bitrate=20000000 ! webmmux ! filesink \
         location= -e

  .. note::

     AV1 encode using gst-v4l2 is supported only on Jetson AGX Orin.

- AV1 Encode with IVF Headers (NVIDIA Accelerated Encode)::

     $ gst-launch-1.0 nvarguscamerasrc ! \
         'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
         format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2av1enc \
         enable-headers=1 bitrate=8000000 ! filesink \
         location= -e
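The capture-based examples above use ``nvarguscamerasrc``. Encoding from a raw YUV file works the same way once the frames are described with ``videoparse`` and copied to NVMM memory with ``nvvidconv``. A sketch, assuming ``test.yuv`` holds 1280x720 I420 frames at 30 fps::

     $ gst-launch-1.0 filesrc location=test.yuv ! \
         videoparse width=1280 height=720 format=i420 framerate=30 ! \
         nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! \
         nvv4l2h264enc bitrate=8000000 ! h264parse ! qtmux ! \
         filesink location=test.mp4 -e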
Image Encode Examples Using gst-launch-1.0
##########################################

The following example shows how you can perform JPEG encode with GStreamer-1.0.

- Image Encode::

     $ gst-launch-1.0 videotestsrc num-buffers=1 ! \
         'video/x-raw, width=(int)640, height=(int)480, \
         format=(string)I420' ! nvjpegenc ! \
         filesink location=test.jpg -e

Supported H.264/H.265/VP9/AV1 Encoder Features with GStreamer-1.0
#################################################################

This section describes example gst-launch-1.0 usage for features supported by
the NVIDIA accelerated H.264/H.265/VP9/AV1 encoders.

Features Supported Using gst-v4l2
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$

This section describes example gst-launch-1.0 usage for features supported by
the NVIDIA accelerated H.264/H.265/VP9/AV1 ``gst-v4l2`` encoders.

.. note::

   To display detailed information about the properties of the
   ``nvv4l2h264enc``, ``nvv4l2h265enc``, ``nvv4l2vp9enc``, or ``nvv4l2av1enc``
   encoder, enter the command::

      $ gst-inspect-1.0 [nvv4l2h264enc | nvv4l2h265enc | nvv4l2vp9enc | nvv4l2av1enc]

- Set I-frame interval (supported with H.264/H.265/VP9/AV1 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=(int)1280, height=(int)720, \
         format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
         iframeinterval=100 ! h264parse ! qtmux ! filesink \
         location= -e

  This property sets the encoder's intra frame (I-frame) occurrence frequency.

- Set rate control mode and bitrate (supported with H.264/H.265/VP9/AV1
  encode). The supported modes are 0 (variable bit rate, or VBR) and
  1 (constant bit rate, or CBR).

  - Set variable bitrate mode::

       $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
           'video/x-raw, width=(int)1280, height=(int)720, \
           format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
           'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
           control-rate=0 bitrate=30000000 ! h264parse ! qtmux ! filesink \
           location= -e

    .. note::

       The AV1 codec does not currently support VBR mode.

  - Set constant bitrate mode::

       $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
           'video/x-raw, width=(int)1280, height=(int)720, \
           format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
           'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
           control-rate=1 bitrate=30000000 ! h264parse ! qtmux ! filesink \
           location= -e

- Set peak bitrate::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=(int)1280, height=(int)720, \
         format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
         bitrate=6000000 peak-bitrate=6500000 ! h264parse ! qtmux ! \
         filesink location= -e

  Peak bitrate takes effect only in variable bit rate mode
  (``control-rate=0``). By default, the value is configured as
  (1.2\ |times|\ bitrate).

- Set quantization parameters for I, P and B frames (supported with
  H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=(int)1280, height=(int)720, \
         format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
         ratecontrol-enable=0 quant-i-frames=30 quant-p-frames=30 \
         quant-b-frames=30 num-B-Frames=1 ! filesink \
         location= -e

  The B-frame quantization parameter does not take effect if the number of
  B frames is 0.

- Set quantization range for I, P and B frames (supported with H.264/H.265
  encode). The format for the range is::

     "<I range>:<P range>:<B range>"

  where ``<I range>``, ``<P range>``, and ``<B range>`` are each expressed in
  the form ``<min>,<max>``, as in this example::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=(int)1280, height=(int)720, \
         format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
         qp-range="24,24:28,28:30,30" num-B-Frames=1 ! 
'video/x-h264, \ stream-format=(string)byte-stream, alignment=(string)au' ! filesink \ location= -e - Set hardware preset level (supported with H.264/H.265/VP9/AV1 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ preset-level=4 MeasureEncoderLatency=1 ! 'video/x-h264, \ stream-format=(string)byte-stream, alignment=(string)au' ! \ filesink location= -e The following modes are supported: - 0: **DisablePreset**. - 1: **UltraFastPreset**. - 2: **FastPreset**: Only integer pixel (``integer-pel``) block motion is estimated. For I/P macroblock mode decisions, only Intra 16\ |times|\ 16 cost is compared with intermode costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes. - 3: **MediumPreset**: Supports up to half pixel (``half-pel``) block motion estimation. For I/P macroblock mode decisions, only Intra 16\ |times|\ 16 cost is compared with intermode costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes. - 4: **SlowPreset**: Supports up to quarter pixel (``Qpel``) block motion estimation. For I/P macroblock mode decisions, Intra 4\ |times|\ 4 as well as Intra 16\ |times|\ 16 cost is compared with intermode costs. Supports Intra 16\ |times|\ 16 and Intra 4\ |times|\ 4 modes. .. note:: AV1 codec currently supports only UltraFastPreset and FastPreset. - Set profile (supported with H.264/H.265 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ profile=0 ! 'video/x-h264, stream-format=(string)byte-stream, \ alignment=(string)au' ! filesink location= -e The following profiles are supported for H.264 encode: - 0: Baseline profile - 2: Main profile - 4: High profile The following profiles are supported for H.265 encode: - 0: Main profile - 1: Main10 profile - Insert SPS and PPS at IDR (supported with H.264/H.265 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ insert-sps-pps=1 ! \ 'video/x-h264, stream-format=(string)byte-stream, \ alignment=(string)au' ! filesink location= -e If enabled, a sequence parameter set (SPS) and a picture parameter set (PPS) are inserted before each IDR frame in the H.264/H.265 stream. - Enable two-pass CBR (supported with H.264/H.265 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ control-rate=1 bitrate=10000000 EnableTwopassCBR=1 ! \ 'video/x-h264, stream-format=(string)byte-stream, \ alignment=(string)au' ! filesink location= -e Two-pass CBR must be enabled along with constant bit rate (``control-rate=1``). .. note:: For multi-instance encode with two-pass CBR enabled, enable max perf mode by using the maxperf-enable property of the ``gst-v4l2`` encoder to achieve best performance. Expect increased power consumption in max perf mode. - Slice-header-spacing with spacing in terms of macroblocks (Supported with H.264/H.265 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! 
\ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ slice-header-spacing=8 bit-packetization=0 ! 'video/x-h264, \ stream-format=(string)byte-stream, alignment=(string)au' ! \ filesink location= -e The ``bit-packetization=0`` parameter configures the network abstraction layer (NAL) packet as macroblock (MB)-based, and ``slice-header-spacing=8`` configures each NAL packet as 8\ |nbsp|\ macroblocks maximum. .. todo:: We're using "slice header spacing" both with and without hyphenation. Which form is correct? The former suggests the name of a technique or mode, the latter a setting or parameter name (which should be in code font and should not be capitalized). - Slice header spacing with spacing in terms of number of bits (supported with H.264/H.265 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ slice-header-spacing=1400 bit-packetization=1 ! 'video/x-h264, \ stream-format=(string)byte-stream, alignment=(string)au' ! \ filesink location= -e The parameter ``bit-packetization=1`` configures the network abstraction layer (NAL) packet as size-based, and ``slice-header-spacing=1400`` configures each NAL packet as 1400\ |nbsp|\ bytes maximum. - Enable CABAC-entropy-coding (supported with H.264 encode for main or high profile):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ profile=2 cabac-entropy-coding=1 ! 'video/x-h264, \ stream-format=(string)byte-stream, alignment=(string)au' ! \ filesink location= -e The following entropy coding types are supported: - 0: CAVLC - 1: CABAC - Set number of B frames between two reference frames (supported with H.264/H.265 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ num-B-Frames=1 ! 'video/x-h264, stream-format=(string)byte-stream, \ alignment=(string)au' ! filesink location= -e This property sets the number of B frames between two reference frames. .. note:: For multi-instance encode with ``num-B-Frames=2``, enable max perf mode by specifying the maxperf-enable property of the ``gst-v4l2`` encoder for best performance. Expect increased power consumption in max perf mode. - Enable motion vector metadata (supported with H.264/H.265 encode):: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ EnableMVBufferMeta=1 ! 'video/x-h264, \ stream-format=(string)byte-stream, alignment=(string)au' ! \ filesink location= -e - Set virtual buffer size:: $ gst-launch-1.0 videotestsrc num-buffers=300 ! \ 'video/x-raw, width=(int)1280, height=(int)720, \ format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \ vbv-size=10 ! h264parse ! qtmux ! 
\
         filesink location= -e

  If the buffer size of the decoder or the network bandwidth is limited,
  configuring the virtual buffer size constrains the generated video stream
  according to the following formula:

  virtual buffer size = vbv-size |times| (bitrate/fps)

- Insert AUD (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=(int)1280, height=(int)720, \
         format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
         insert-aud=1 ! 'video/x-h264, stream-format=(string)byte-stream, \
         alignment=(string)au' ! filesink location= -e

  This property inserts an H.264/H.265 access unit delimiter (AUD).

- Insert VUI (supported with H.264/H.265 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=(int)1280, height=(int)720, \
         format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
         insert-vui=1 ! 'video/x-h264, stream-format=(string)byte-stream, \
         alignment=(string)au' ! filesink location= -e

  This property inserts H.264/H.265 video usability information (VUI) in the
  SPS.

- Set picture order count (POC) type (supported with H.264 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! \
         nvv4l2h264enc \
         poc-type=2 ! h264parse ! filesink location= -e

  The following values are supported for the ``poc-type`` property:

  - 0: POC explicitly specified in each slice header (the default)
  - 2: Decoding/coding order and display order are the same

- Set Disable CDF Update (supported with AV1 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! \
         nvv4l2av1enc \
         disable-cdf=0 enable-headers=1 ! filesink location= -e

- Set Tile Configuration (supported with AV1 encode):

  - For 1x2 Tile configuration::

       $ gst-launch-1.0 videotestsrc num-buffers=30 ! \
           'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! \
           nvv4l2av1enc \
           tiles="1,0" bitrate=20000000 ! qtmux ! \
           filesink location= -e

  - For 2x1 Tile configuration::

       $ gst-launch-1.0 videotestsrc num-buffers=30 ! \
           'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! \
           nvv4l2av1enc \
           tiles="0,1" bitrate=20000000 ! qtmux ! \
           filesink location= -e

  - For 2x2 Tile configuration::

       $ gst-launch-1.0 videotestsrc num-buffers=30 ! \
           'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! \
           nvv4l2av1enc \
           preset-level=1 tiles="1,1" bitrate=20000000 ! qtmux ! \
           filesink location= -e

  This feature encodes frames as super-macroblocks, with Log2(Rows) and
  Log2(Columns) as the input.

- Set SSIM RDO (supported with AV1 encode)::

     $ gst-launch-1.0 videotestsrc num-buffers=30 ! \
         'video/x-raw, width=1920, height=1080, format=I420' ! nvvidconv ! \
         nvv4l2av1enc \
         enable-srdo=1 ! qtmux ! \
         filesink location= -e
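Many of these properties can be combined in a single pipeline. For example, a streaming-oriented H.264 configuration might pair constant bit rate with a short I-frame interval and in-band SPS/PPS; the following sketch uses illustrative values only::

     $ gst-launch-1.0 videotestsrc num-buffers=300 ! \
         'video/x-raw, width=(int)1280, height=(int)720, \
         format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc \
         control-rate=1 bitrate=4000000 iframeinterval=30 insert-sps-pps=1 ! \
         h264parse ! qtmux ! filesink location=test.mp4 -e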
Camera Capture with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

To display ``nvgstcapture-1.0`` usage information, enter the command::

   $ nvgstcapture-1.0 --help

.. note::

   By default, the ``nvgstcapture-1.0`` application supports only the ARGUS
   API, using the ``nvarguscamerasrc`` plugin. Support for the legacy
   ``nvcamerasrc`` plugin is deprecated. For more information, see
   `nvgstcapture-1.0 Reference <#nvgstcapture-1-0-reference>`__.

CSI Camera Capture Using nvarguscamerasrc
#########################################

Use the following command to capture using ``nvarguscamerasrc`` and preview the
display with ``nvdrmvideosink``::

   $ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), \
       width=(int)1920, height=(int)1080, format=(string)NV12, \
       framerate=(fraction)30/1' ! nvdrmvideosink -e

.. note::

   The ``nvarguscamerasrc`` plugin's ``maxperf`` property has been removed.
   VIC actmon DFS now handles VIC frequency scaling according to load, which
   lets clients get the required performance without that property.

Progressive Capture Using nvv4l2camerasrc
#########################################

To capture and preview the display with ``nv3dsink``, enter the command::

   $ gst-launch-1.0 nvv4l2camerasrc device=/dev/video3 ! \
       'video/x-raw(memory:NVMM), format=(string)UYVY, \
       width=(int)1920, height=(int)1080, \
       interlace-mode=progressive, \
       framerate=(fraction)30/1' ! nvvidconv ! \
       'video/x-raw(memory:NVMM), format=(string)NV12' ! \
       nv3dsink -e

.. note::

   The ``nvv4l2camerasrc`` plugin currently supports only the DMABUF
   (importer role) streaming I/O mode with ``V4L2_MEMORY_DMABUF``.

   The ``nvv4l2camerasrc`` plugin is currently verified using the NVIDIA V4L2
   driver with a sensor that supports YUV capture in UYVY format. If you need
   to use a different type of sensor to capture in other YUV formats, see the
   topic :ref:`Sensor Software Driver Programming `. In that case
   ``nvv4l2camerasrc`` must also be enhanced to support the required YUV
   format.

The ``nvgstcapture-1.0`` application uses the ``v4l2src`` plugin to capture
still images and video from USB cameras. USB cameras that output YUV are
supported with the following features:

- Preview display
- Image capture (VGA, 640\ |times|\ 480)
- Video capture (480p, 720p, H.264/H.265/VP8/VP9 encode)

Raw-YUV Capture Using v4l2src
#############################

Use the following command to capture raw YUV using ``v4l2src`` and preview the
display with ``xvimagesink``::

   $ gst-launch-1.0 v4l2src device="/dev/video0" ! \
       "video/x-raw, width=640, height=480, format=(string)YUY2" ! \
       xvimagesink -e
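Similarly to the preview pipeline above, the ``v4l2src`` capture can be fed to the hardware encoder by converting it to NVMM memory with ``nvvidconv``. A sketch, assuming a USB camera at ``/dev/video0`` that outputs 1280x720 YUY2::

   $ gst-launch-1.0 v4l2src device="/dev/video0" ! \
       'video/x-raw, width=1280, height=720, format=(string)YUY2' ! \
       nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! \
       nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4 -e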
Camera Capture and Encode Support with OpenCV
#############################################

The OpenCV sample application ``opencv_nvgstcam`` simulates the camera capture
pipeline. Similarly, the OpenCV sample application ``opencv_nvgstenc`` simulates
the video encode pipeline. Both sample applications are based on GStreamer-1.0
and currently support only OpenCV version 3.3.

- opencv_nvgstcam: Camera capture and preview.

  To display usage information for the ``opencv_nvgstcam`` sample application,
  enter the command::

     $ ./opencv_nvgstcam --help

  .. note::

     Currently, ``opencv_nvgstcam`` supports only single-instance CSI capture
     using the ``nvarguscamerasrc`` plugin. You can modify and rebuild the
     application to support GStreamer pipelines for CSI multi-instance capture
     and USB camera capture by using the ``v4l2src`` plugin. The application
     uses an OpenCV-based video sink for display.

  For camera CSI capture and preview rendering with OpenCV, enter the command::

     $ ./opencv_nvgstcam --width=1920 --height=1080 --fps=30

- opencv_nvgstenc: Camera capture and video encode.

  To display usage information for the ``opencv_nvgstenc`` sample application,
  enter the command::

     $ ./opencv_nvgstenc --help

  .. note::

     Currently, ``opencv_nvgstenc`` supports only camera CSI capture using the
     ``nvarguscamerasrc`` plugin and video encode in H.264 format using the
     ``nvv4l2h264enc`` plugin with an MP4 container file. You can modify and
     rebuild the application to support GStreamer pipelines for different
     video encoding formats. The application uses an OpenCV-based video sink
     for display.

  For camera CSI capture and video encode with OpenCV, enter the command::

     $ ./opencv_nvgstenc --width=1920 --height=1080 --fps=30 --time=60 \
         --filename=test_h264_1080p_30fps.mp4
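For reference, the capture-and-encode pipeline that ``opencv_nvgstenc`` simulates corresponds roughly to the following gst-launch-1.0 sketch; the resolution, bitrate, and output filename are illustrative::

   $ gst-launch-1.0 nvarguscamerasrc ! \
       'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \
       format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h264enc \
       bitrate=8000000 ! h264parse ! qtmux ! \
       filesink location=test_h264_1080p_30fps.mp4 -e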
Video Playback with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

To display ``nvgstplayer-1.0`` usage information, enter the command::

   $ nvgstplayer-1.0 --help

Video can be output to HD displays using the HDMI connector on the Jetson
device. GStreamer-1.0 currently supports the video sinks described in this
section.

To use an overlay sink (video playback on an overlay in full-screen mode),
enter the command::

   $ gst-launch-1.0 filesrc location= ! \
       qtdemux name=demux ! h264parse ! nvv4l2decoder ! nvdrmvideosink -e

Video Playback Examples
#######################

The following examples show how you can perform video playback using
GStreamer-1.0.

- ``nveglglessink`` (windowed video playback, NVIDIA EGL/GLES video sink using
  the default X11 backend):

  Enter this command to start the GStreamer pipeline using ``nveglglessink``
  with the default X11 backend::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux name=demux ! h264parse ! nvv4l2decoder ! nveglglessink -e

  The ``nvgstplayer-1.0`` application accepts command line options that
  specify the window position and dimensions for windowed playback::

     $ nvgstplayer-1.0 -i --window-x=300 --window-y=300 \
         --window-width=500 --window-height=500

- ``nveglglessink`` (windowed video playback, NVIDIA EGL/GLES video sink using
  the Wayland backend):

  You can use ``nveglglessink`` with the Wayland backend instead of the
  default X11 backend. Ubuntu 20.04 does not support the Wayland display
  server, so there is no UI option to switch from Xorg to Wayland; you must
  start the Wayland server (Weston) from the target's shell before performing
  Weston-based operations.

  To start Weston, complete the following step **before** you run the
  GStreamer pipeline with the Wayland backend for the first time. The step is
  not required after the initial run.

  Start Weston::

     $ nvstart-weston.sh

  To run the GStreamer pipeline with the Wayland backend, run the following
  command to start the pipeline and use ``nveglglessink`` with the Wayland
  backend::

     $ gst-launch-1.0 filesrc \
         location= ! qtdemux name=demux ! h264parse ! \
         nvv4l2decoder ! nveglglessink winsys=wayland

- ``nvdrmvideosink`` (video playback using DRM):

  This sink element uses DRM to render video on connected displays. Before you
  use ``nvdrmvideosink``, stop the display manager and load the DRM driver.

  Stop the display manager::

     $ sudo systemctl stop gdm
     $ sudo loginctl terminate-seat seat0

  Load the DRM driver. For Jetson Xavier::

     $ sudo modprobe tegra_udrm modeset=1

  For Jetson Orin::

     $ sudo modprobe nvidia-drm modeset=1

  To start the GStreamer pipeline using ``nvdrmvideosink``, run the following
  command::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux ! queue ! h264parse ! nvv4l2decoder ! nvdrmvideosink -e

  ``nvdrmvideosink`` supports these properties:

  - ``conn_id``: Set the connector ID for the display.
  - ``plane_id``: Set the plane ID.
  - ``set_mode``: Set the default mode (resolution) for playback.

  The following command illustrates the use of these properties::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux ! queue ! h264parse ! nvv4l2decoder ! nvdrmvideosink \
         conn_id=0 plane_id=1 set_mode=0 -e

- ``nv3dsink`` video sink (video playback using the 3D graphics API):

  This video sink element works with NVMM buffers and renders using the 3D
  graphics rendering API. It performs better than ``nveglglessink`` with NVMM
  buffers.

  This command starts the GStreamer pipeline using ``nv3dsink``::

     $ gst-launch-1.0 filesrc location= ! \
         qtdemux ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e

  The sink supports setting a specific window position and dimensions using
  the properties shown in this example::

     $ nv3dsink window-x=300 window-y=300 window-width=512 window-height=512

Video Decode Support with OpenCV
################################

You can simulate a video decode pipeline using the GStreamer-1.0-based OpenCV
sample application ``opencv_nvgstdec``.

.. note::

   The sample application currently operates only with OpenCV version 3.3.
To perform video decoding with ``opencv_nvgstdec``, enter the command:: $ ./opencv_nvgstdec --help .. note:: Currently, ``opencv_nvgstdec`` only supports video decode of H264 format using the ``nvv4l2decoder`` plugin. You can modify and rebuild the application to support GStreamer pipelines for video decode of different formats. For display, the application utilizes an OpenCV based video sink component. To perform video decoding with ``opencv_nvgstdec``, enter the command:: $ ./opencv_nvgstdec --file-path=test_file_h264.mp4 Video Streaming with GStreamer-1.0 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ This section describes procedures for video streaming with GStreamer 1.0. To perform video streaming with nvgstplayer-1.0 ############################################### - Using nvgstplayer-1.0: Enter the command:: $ nvgstplayer-1.0 -i rtsp://10.25.20.77:554/RTSP_contents/VIDEO/H264/ test_file_h264.3gp –stats The supported formats for video streaming are: .. raw:: html :file: AcceleratedGstreamer/ToPerformVideoStreamingWithNvgstplayer10.htm - Using gst-launch-1.0 pipeline: - Streaming and video rendering: - Transmitting (from target): CSI camera capture + video encode + RTP streaming using network sink:: $ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), \ format=NV12, width=1920, height=1080' ! \ nvv4l2h264enc insert-sps-pps=true ! h264parse ! \ rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e - Receiving (on target) : Network Source + video decode + video render:: $ gst-launch-1.0 udpsrc address=127.0.0.1 port=8001 \ caps='application/x-rtp, encoding-name=(string)H264, payload=(int)96' ! \ rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e - Streaming and file dump: - Transmitting (from target): CSI camera capture + video encode + RTP streaming using network sink:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080' ! \ nvv4l2h264enc insert-sps-pps=true ! h264parse ! \ rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e - Receiving (on target): Network Source + video decode + file dump:: $ gst-launch-1.0 udpsrc address=127.0.0.1 port=8001 \ caps='application/x-rtp, encoding-name=(string)H264, payload=(int)96' ! \ rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nvvidconv ! \ 'video/x-raw, format=(string)I420' ! filesink location=test.yuv -e .. _SD.Multimedia.AcceleratedGstreamer-VideoFormatConversionWithGStreamer10: Video Format Conversion with GStreamer-1.0 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ The NVIDIA proprietary ``nvvidconv`` GStreamer-1.0 plugin allows conversion between OSS (raw) video formats and NVIDIA video formats. The ``nvvidconv`` plugin currently supports the format conversions described in this section. Raw-YUV Input Formats ##################### Currently VIC based ``nvvidconv`` on Jetson supports the ``I420``, ``UYVY``, ``YUY2``, ``YVYU``, ``NV12``, ``NV16``, ``NV24``, ``P010_10LE``, ``GRAY8``, ``BGRx``, ``RGBA``, and ``Y42B RAW-YUV`` input formats and CUDA based ``nvvidconv`` on GPU supports the ``I420``, ``NV12``, ``P010_10LE``, ``GRAY8``, ``BGRx`` and ``RGBA`` input formats. Enter the following commands to perform VIC-based conversion on Jetson Linux: - Using the ``gst-v4l2`` encoder (with other than the GRAY8 pipeline):: $ gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)UYVY, \ width=(int)1280, height=(int)720' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! \ nvv4l2h264enc ! 'video/x-h264, \ stream-format=(string)byte-stream' ! 
h264parse ! \ qtmux ! filesink location=test.mp4 -e - Using the ``gst-v4l2`` encoder with the GRAY8 pipeline:: $ gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)GRAY8, \ width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \ nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! \ nvv4l2h264enc ! 'video/x-h264, \ stream-format=(string)byte-stream' ! h264parse ! qtmux ! \ filesink location=test.mp4 -e Enter the following commands to perform CUDA-based conversion on an integrated GPU: .. note:: The gst-v4l2 encoder does not support CUDA memory, so the output of the first nvvidconv by using GPU is converted to surface array memory by using VIC. - Using the ``gst-v4l2`` encoder (with other than the GRAY8 pipeline):: $ gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)NV12, \ width=(int)1280, height=(int)720' ! nvvidconv compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! 'video/x-raw, \ format=(string)I420' ! nvvidconv compute-hw=VIC \ nvbuf-memory-type=nvbuf-mem-surface-array ! 'video/x-raw(memory:NVMM)' ! \ nvv4l2h264enc ! 'video/x-h264, \ stream-format=(string)byte-stream' ! h264parse ! \ qtmux ! filesink location=test.mp4 -e - Using the ``gst-v4l2`` encoder with the GRAY8 pipeline:: $ gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)GRAY8, \ width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \ nvvidconv compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw, format=(string)I420' ! nvvidconv compute-hw=VIC \ nvbuf-memory-type=nvbuf-mem-surface-array ! 'video/x-raw(memory:NVMM)' ! \ nvv4l2h264enc ! 'video/x-h264, \ stream-format=(string)byte-stream' ! h264parse ! qtmux ! \ filesink location=test.mp4 -e Enter the following command to perform CUDA-based format conversion on a dedicated GPU: .. note:: The gst-v4l2 encoder can directly use the CUDA memory on a dedicated GPU. - Using the ``gst-v4l2`` encoder:: $ gst-launch-1.0 filesrc location=input_4k_60p.yuv ! videoparse width=3840 \ height=2160 format=i420 framerate=60 ! nvvidconv compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! 'video/x-raw(memory:NVMM), \ width=(int)3840, height=(int)2160, format=(string)I420, framerate=60/1' ! \ nvv4l2h264enc ! 'video/x-h264, stream-format=(string)byte-stream, \ alignment=(string)au' ! h264parse ! qtmux ! \ filesink location=test.mp4 -e .. note:: Format conversion with raw YUV input is CPU-intensive due to the “software to hardware” memory copies involved. Raw-YUV Output Formats ###################### Currently VIC based ``nvvidconv`` on Jetson supports the ``I420``, ``UYVY``, ``YUY2``, ``YVYU``, ``NV12``, ``NV16``, ``NV24``, ``GRAY8``, ``BGRx``, ``RGBA``, and ``Y42B RAW-YUV`` output formats and CUDA based ``nvvidconv`` on GPU supports the ``I420``, ``NV12``, ``P010_10LE``, ``I420_10LE``, ``GRAY8``, ``BGRx`` and ``RGBA`` output formats. Enter the following commands to perform VIC based format conversion on Jetson Linux: - Using the ``gst-v4l2`` decoder (with other than the GRAY8 pipeline):: $ gst-launch-1.0 filesrc location=640x480_30p.mp4 ! qtdemux ! \ queue ! h264parse ! nvv4l2decoder ! nvvidconv ! \ 'video/x-raw, format=(string)UYVY' ! videoconvert ! xvimagesink -e - Using the ``gst-v4l2`` decoder with the GRAY8 pipeline:: $ gst-launch-1.0 filesrc location=720x480_30i_MP.mp4 ! qtdemux ! \ queue ! h264parse ! nvv4l2decoder ! nvvidconv ! 'video/x-raw, \ format=(string)GRAY8' ! videoconvert ! 
xvimagesink -e Enter the following command to perform CUDA-based format conversion on an integrated GPU: - Using the ``gst-v4l2`` decoder:: $ gst-launch-1.0 filesrc location=640x480_30p.mp4 ! qtdemux ! \ queue ! h264parse ! nvv4l2decoder ! nvvidconv compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! nv3dsink -e Enter the following command to perform CUDA-based format conversion on a dedicated GPU: - Using the ``gst-v4l2`` decoder:: $ gst-launch-1.0 filesrc location=720x480_30i_MP.mp4 ! qtdemux ! \ h264parse ! nvv4l2decoder cudadec-memtype=1 ! nvvidconv compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! nveglglessink -e .. note:: Format conversion with raw YUV output is CPU-intensive due to the “hardware to software” memory copies involved. NVIDIA Input and Output Formats ############################### Currently CUDA based ``nvvidconv`` on GPU supports the ``I420``, ``NV12``, ``P010_10LE``, ``GRAY8``, ``BGRx`` and ``RGBA`` input formats and supports the ``I420``, ``NV12``, ``P010_10LE``, ``I420_10LE``, ``GRAY8``, ``BGRx`` and ``RGBA`` output formats and VIC based ``nvvidconv`` on Jetson supports the combinations of NVIDIA input and output formats described in the following table. Any format in the column on the left can be converted to any format in the same row in the column on the right. .. raw:: html :file: AcceleratedGstreamer/NvidiaInputAndOutputFormats.htm Enter the following commands to perform VIC-based conversion between NVIDIA formats on Jetson Linux: - Using the ``gst-v4l2`` decoder:: $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \ h264parse ! nvv4l2decoder ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)RGBA' ! nvdrmvideosink -e - Using the ``gst-v4l2`` encoder:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nvv4l2h264enc ! \ h264parse ! qtmux ! filesink location=test.mp4 -e - Using the ``gst-v4l2`` decoder and nv3dsink with the GRAY8 pipeline:: $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \ h264parse ! nvv4l2decoder ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)GRAY8' ! nvvidconv ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nv3dsink -e Enter the following commands to perform CUDA-based conversion between NVIDIA formats on an integrated GPU: - Using the ``gst-v4l2`` decoder:: $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \ h264parse ! nvv4l2decoder ! nvvidconv compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw(memory:NVMM), format=(string)RGBA' ! nv3dsink -e .. note:: The gst-v4l2 encoder does not support CUDA memory, so the output of the first nvvidconv by using GPU is converted to surface array memory by using VIC. - Using the ``gst-v4l2`` encoder:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=(fraction)30/1' ! \ nvvidconv compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw, format=(string)I420' ! \ nvvidconv compute-hw=VIC nvbuf-memory-type=nvbuf-mem-surface-array ! \ 'video/x-raw(memory:NVMM)' ! nvv4l2h264enc ! \ h264parse ! qtmux ! filesink location=test.mp4 -e - Using the ``gst-v4l2`` decoder and nv3dsink with the GRAY8 pipeline:: $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \ h264parse ! nvv4l2decoder ! 
nvvidconv compute-hw=GPU \
         nvbuf-memory-type=nvbuf-mem-cuda-device ! \
         'video/x-raw(memory:NVMM), format=(string)GRAY8' ! \
         nvvidconv compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device ! \
         'video/x-raw(memory:NVMM), format=(string)I420' ! nv3dsink -e

Enter the following commands to perform CUDA-based conversion between NVIDIA
formats on a dedicated GPU:

.. note::

   The gst-v4l2 encoder can directly use CUDA memory on a dedicated GPU.

- Using the ``gst-v4l2`` encoder::

     $ gst-launch-1.0 filesrc location=input_4k_60p_NV12.yuv ! videoparse width=3840 \
         height=2160 format=23 framerate=60 ! nvvidconv compute-hw=GPU \
         nvbuf-memory-type=nvbuf-mem-cuda-device ! 'video/x-raw(memory:NVMM), \
         width=(int)3840, height=(int)2160, format=(string)I420, framerate=60/1' ! \
         nvv4l2h264enc ! 'video/x-h264, stream-format=(string)byte-stream, \
         alignment=(string)au' ! h264parse ! qtmux ! \
         filesink location=test.mp4 -e

- Using the ``gst-v4l2`` decoder::

     $ gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! \
         h264parse ! nvv4l2decoder cudadec-memtype=1 ! nvvidconv compute-hw=GPU \
         nvbuf-memory-type=nvbuf-mem-cuda-device ! 'video/x-raw(memory:NVMM), \
         width=1280, height=720, format=(string)I420' ! nveglglessink -e
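Because the exact set of formats supported by ``nvvidconv`` can vary between releases, you can confirm the caps the plugin advertises on your system by inspecting it directly::

   $ gst-inspect-1.0 nvvidconv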
Video Transcode with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

You can perform video transcoding between the following video formats:

- H.264 decode to VP9 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! qtdemux ! \
        h264parse ! nvv4l2decoder ! nvv4l2vp9enc ! matroskamux name=mux ! \
        filesink location= -e

- H.265 decode to VP9 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        qtdemux name=demux demux.video_0 ! queue ! h265parse ! nvv4l2decoder ! \
        nvv4l2vp9enc bitrate=20000000 ! queue ! matroskamux name=mux ! \
        filesink location= -e

- VP8 decode to H.264 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
        nvv4l2h264enc bitrate=20000000 ! h264parse ! queue ! \
        qtmux name=mux ! filesink location= -e

- VP9 decode to H.265 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
        nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! \
        qtmux name=mux ! filesink location= -e

- MPEG-4 decode to VP9 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        qtdemux name=demux demux.video_0 ! queue ! mpeg4videoparse ! \
        nvv4l2decoder ! nvv4l2vp9enc bitrate=20000000 ! queue ! \
        matroskamux name=mux ! filesink \
        location= -e

- MPEG-4 decode to H.264 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        qtdemux name=demux demux.video_0 ! queue ! mpeg4videoparse ! \
        nvv4l2decoder ! nvv4l2h264enc bitrate=20000000 ! h264parse ! \
        queue ! qtmux name=mux ! filesink \
        location= -e

- H.264 decode to AV1 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
        nvv4l2decoder ! nvv4l2av1enc bitrate=20000000 ! queue ! \
        matroskamux name=mux ! \
        filesink location= -e

- H.265 decode to AV1 encode (NVIDIA accelerated decode to NVIDIA
  accelerated encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        qtdemux name=demux demux.video_0 ! queue ! h265parse ! \
        nvv4l2decoder ! nvv4l2av1enc bitrate=20000000 ! queue ! \
        matroskamux name=mux ! \
        filesink location= -e

- VP8 decode to MPEG-4 encode (NVIDIA accelerated decode to OSS software
  encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
        nvvidconv ! avenc_mpeg4 bitrate=4000000 ! queue ! \
        qtmux name=mux ! filesink location= -e

- VP9 decode to MPEG-4 encode (NVIDIA accelerated decode to OSS software
  encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        matroskademux name=demux demux.video_0 ! queue ! nvv4l2decoder ! \
        nvvidconv ! avenc_mpeg4 bitrate=4000000 ! qtmux name=mux ! \
        filesink location= -e

- H.264 decode to Theora encode (NVIDIA accelerated decode to OSS software
  encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
        nvv4l2decoder ! nvvidconv ! theoraenc bitrate=4000000 ! \
        oggmux name=mux ! filesink location= -e

- H.264 decode to H.263 encode (NVIDIA accelerated decode to OSS software
  encode):

  - Using the ``gst-v4l2`` pipeline::

      $ gst-launch-1.0 filesrc location= ! \
        qtdemux name=demux demux.video_0 ! queue ! h264parse ! \
        nvv4l2decoder ! nvvidconv ! \
        'video/x-raw, width=(int)704, height=(int)576, \
        format=(string)I420' ! avenc_h263 bitrate=4000000 ! qtmux ! \
        filesink location= -e
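The transcode examples above carry only the video stream; any audio track in
the source file is dropped at the demuxer. If the source audio should be
passed through without re-encoding, a second branch can be linked from the
demuxer to the muxer. The following pipeline is an untested sketch that
assumes a hypothetical input file named ``1280x720_30p_aac.mp4`` whose audio
track is AAC, so that ``aacparse`` (from the GStreamer "good" plugins) can
parse it without transcoding::

    $ gst-launch-1.0 filesrc location=1280x720_30p_aac.mp4 ! \
      qtdemux name=demux demux.video_0 ! queue ! h264parse ! nvv4l2decoder ! \
      nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! \
      qtmux name=mux ! filesink location=test.mp4 \
      demux.audio_0 ! queue ! aacparse ! queue ! mux. -e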
CUDA Video Post-Processing with GStreamer-1.0
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

This section describes GStreamer-1.0 plugins for |NVIDIA(r)| CUDA\ |reg|
post-processing operations.
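Before building a pipeline around the element described below, you can
confirm that it is present in the local GStreamer registry and review its
properties (for example, ``cuda-process`` and ``customer-lib-name``) by
entering the command::

    $ gst-inspect-1.0 nvivafilter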
gst-nvivafilter ############### This NVIDIA proprietary GStreamer-1.0 plugin performs pre/post and CUDA post-processing operations on CSI camera captured or decoded frames, and renders video using overlay video sink or video encode. .. note:: The ``gst-nvivafilter`` pipeline requires unsetting the ``DISPLAY`` environment variable using the command ``unset DISPLAY`` if ``lightdm`` is stopped. - Sample decode pipeline: - Using the ``gst-v4l2`` decoder:: $ gst-launch-1.0 filesrc location= ! qtdemux ! queue ! \ h264parse ! nvv4l2decoder ! nvivafilter cuda-process=true \ customer-lib-name="libnvsample_cudaprocess.so" ! \ 'video/x-raw(memory:NVMM), format=(string)NV12' ! \ nvdrmvideosink -e - Sample CSI camera pipeline:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, \ format=(string)NV12, framerate=(fraction)30/1' ! \ nvivafilter cuda-process=true \ customer-lib-name="libnvsample_cudaprocess.so" ! \ 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink -e .. note:: See ``nvsample_cudaprocess_src.tbz2`` for the ``libnvsample_cudaprocess.so`` library sources. The sample CUDA implementation of ``libnvsample_cudaprocess.so`` can be replaced by a custom CUDA implementation. .. _SD.Multimedia.AcceleratedGstreamer-VideoRotationWithGstreamer10: Video Rotation with GStreamer-1.0 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ The NVIDIA proprietary nvvidconv GStreamer-1.0 plugin also allows you to perform video rotation operations. The following table shows the supported values for the ``nvvidconv`` ``flip-method`` property. ================================ ============================== Flip method ``flip-method`` property value ================================ ============================== Identity (no rotation. default) 0 Counterclockwise 90 degrees 1 Rotate 180 degrees 2 Clockwise 90 degrees 3 Horizontal flip 4 Upper right diagonal flip 5 Vertical flip 6 Upper left diagonal flip 7 ================================ ============================== .. note:: To get information on the nvvidconv flip-method property, enter the command:: $ gst-inspect-1.0 nvvidconv - To rotate the video 90 degrees counterclockwise: - With ``gst-v4l2`` decoder and perform VIC based rotation on Jetson Linux:: $ gst-launch-1.0 filesrc location= ! \ qtdemux name=demux ! h264parse ! nvv4l2decoder ! \ nvvidconv flip-method=1 ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! \ nvdrmvideosink -e - With ``gst-v4l2`` decoder and perform CUDA-based rotation on an integrated GPU:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder ! \ nvvidconv compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device \ flip-method=1 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! \ nv3dsink -e - With ``gst-v4l2`` decoder and perform CUDA-based rotation on a dedicated GPU:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder cudadec-memtype=1 ! \ nvvidconv compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device \ flip-method=1 ! 'video/x-raw(memory:NVMM)' ! \ nveglglessink -e - To rotate the video 90 degrees clockwise: - With ``gst-v4l2`` decoder and perform VIC based rotation on Jetson Linux:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder ! \ nvvidconv flip-method=3 ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! \ nvdrmvideosink -e - With ``gst-v4l2`` decoder and perform CUDA-based rotation on an integrated GPU:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder ! 
\ nvvidconv flip-method=3 compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! \ nv3dsink -e - To rotate 180 degrees: - With ``nvarguscamerasrc`` and perform VIC based rotation on Jetson Linux:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=(fraction)30/1' ! \ nvvidconv flip-method=2 ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nv3dsink -e - With ``nvarguscamerasrc`` and perform CUDA-based rotation on an integrated GPU:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=(fraction)30/1' ! \ nvvidconv flip-method=2 compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw(memory:NVMM), format=(string)I420' ! nv3dsink -e - To scale and rotate the video 90 degrees counterclockwise: - Using the ``gst-v4l2`` decoder and perform VIC based rotation on Jetson Linux:: $ gst-launch-1.0 filesrc location= ! qtdemux ! \ h264parse ! nvv4l2decoder ! nvvidconv flip-method=1 ! \ 'video/x-raw(memory:NVMM), width=(int)480, height=(int)640, \ format=(string)I420' ! nvdrmvideosink -e - Using the ``gst-v4l2`` decoder and perform CUDA-based rotation on an integrated GPU:: $ gst-launch-1.0 filesrc location= ! qtdemux ! \ h264parse ! nvv4l2decoder ! nvvidconv flip-method=1 \ compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw(memory:NVMM), width=(int)480, height=(int)640, \ format=(string)I420' ! nv3dsink -e - To scale and rotate the video 90 degrees clockwise: - With ``nvarguscamerasrc`` and perform VIC based rotation on Jetson Linux:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=(fraction)30/1' ! \ nvvidconv flip-method=3 ! 'video/x-raw(memory:NVMM), \ width=(int)480, height=(int)640, format=(string)I420' ! \ nv3dsink -e - With ``nvarguscamerasrc`` and perform CUDA-based rotation on an integrated GPU:: $ gst-launch-1.0 nvarguscamerasrc ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=(fraction)30/1' ! \ nvvidconv flip-method=3 compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! 'video/x-raw(memory:NVMM), \ width=(int)480, height=(int)640, format=(string)I420' ! \ nv3dsink -e .. todo:: The original's examples referred to both "the gst-omx-decoder" and "the gst-omx decoder." Which one is correct? The same issue applies to ``gst-v4l2`` decoder (or gst-v4l2-decoder). - To scale and rotate the video 180 degrees: - Using the ``gst-v4l2`` decoder and perform VIC based rotation on Jetson Linux:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv flip-method=2 ! \ 'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, \ format=(string)I420' ! nvdrmvideosink -e - Using the ``gst-v4l2`` decoder and perform CUDA-based rotation on an integrated GPU:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv flip-method=2 \ compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, \ format=(string)I420' ! nv3dsink -e Video Composition with GStreamer-1.0 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ With the NVIDIA proprietary nvcompositor GStreamer-1.0 plugin, you can perform video composition operations on camera and gst-v4l2 video decoded streams. 
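The position and size of each composited stream in the following examples are
controlled through ``nvcompositor`` sink pad properties (``xpos``, ``ypos``,
``width``, and ``height``). To review the properties and formats that your
installed version of the element exposes, enter the command::

    $ gst-inspect-1.0 nvcompositor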
To composite decoded streams with different formats ################################################### - Using the ``gst-v4l2`` decoder:: $ gst-launch-1.0 nvcompositor \ name=comp sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1920 \ sink_0::height=1080 sink_1::xpos=0 sink_1::ypos=0 \ sink_1::width=1600 sink_1::height=1024 sink_2::xpos=0 \ sink_2::ypos=0 sink_2::width=1366 sink_2::height=768 \ sink_3::xpos=0 sink_3::ypos=0 sink_3::width=1024 \ sink_3::height=576 ! 'video/x-raw(memory:NVMM)' ! nv3dsink \ filesrc location= ! qtdemux ! \ h264parse ! nvv4l2decoder ! comp. filesrc \ location= ! qtdemux ! h265parse ! \ nvv4l2decoder ! comp. filesrc \ location= ! matroskademux ! \ nvv4l2decoder ! comp. filesrc \ location= ! \ matroskademux ! nvv4l2decoder ! comp. -e To composite different camera feeds ################################### - Using the ``nvarguscamerasrc``:: $ gst-launch-1.0 nvcompositor \ name=comp sink_0::xpos=960 sink_0::ypos=540 sink_0::width=960 \ sink_0::height=540 sink_1::width=1920 sink_1::height=1080 ! \ 'video/x-raw(memory:NVMM)' ! queue ! nv3dsink \ nvarguscamerasrc sensor-id=0 ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=30/1' ! comp. \ nvarguscamerasrc sensor-id=1 ! \ 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, \ format=(string)NV12, framerate=30/1' ! comp. -e Interpolation Methods for Video Scaling @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ The NVIDIA proprietary ``nvvidconv`` GStreamer-1.0 plugin allows you to choose the interpolation method used for scaling. The following table shows the supported values for the VIC based ``nvvidconv`` ``interpolation-method`` property on Jetson. ======================== ======================================= Interpolation method ``interpolation-method`` property value ======================== ======================================= Nearest (default) 0 Bilinear 1 5-tap 2 10-tap 3 Smart 4 Nicest 5 ======================== ======================================= The following table shows the supported values for the CUDA based ``nvvidconv`` ``interpolation-method`` property on GPU. ======================== ======================================= Interpolation method ``interpolation-method`` property value ======================== ======================================= Nearest (default) 0 Bilinear 1 Cubic 2 Super 3 Lanczos 4 ======================== ======================================= .. note:: To display information about the ``nvvidconv`` interpolation-method property, enter the command:: $ gst-inspect-1.0 nvvidconv To use bilinear interpolation method for scaling ################################################ - Using the ``gst-v4l2`` pipeline and perform VIC based scaling on Jetson Linux:: $ gst-launch-1.0 filesrc location=! \ qtdemux name=demux ! h264parse ! nvv4l2decoder ! \ nvvidconv interpolation-method=1 ! \ 'video/x-raw(memory:NVMM), format=(string)I420, width=1280, \ height=720' ! nvdrmvideosink -e - Using the ``gst-v4l2`` pipeline and perform CUDA-based scaling on an integrated GPU:: $ gst-launch-1.0 filesrc location= ! \ qtdemux name=demux ! h264parse ! nvv4l2decoder ! \ nvvidconv interpolation-method=1 compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw(memory:NVMM), format=(string)I420, width=1280, \ height=720' ! nv3dsink -e - Using the ``gst-v4l2`` pipeline and perform CUDA-based scaling on a dedicated GPU:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder cudadec-memtype=1 ! 
\ nvvidconv interpolation-method=1 compute-hw=GPU \ nvbuf-memory-type=nvbuf-mem-cuda-device ! \ 'video/x-raw(memory:NVMM), format=(string)NV12, width=1280, \ height=720' ! nveglglessink -e EGLStream Producer Example @@@@@@@@@@@@@@@@@@@@@@@@@@ The NVIDIA proprietary ``nveglstreamsrc`` and ``nvvideosink`` GStreamer-1.0 plugins allow simulation of an EGLStream producer pipeline (for preview only.) To simulate an EGLStream producer pipeline, enter the command:: $ nvgstcapture-1.0 --camsrc=3 EGL Image Transform Example @@@@@@@@@@@@@@@@@@@@@@@@@@@ The NVIDIA proprietary ``nvegltransform`` GStreamer-1.0 plugin allows simulation of an EGLImage transform pipeline. To simulate an EGL Image transform pipeline: - Using the ``gst-v4l2`` pipeline:: $ gst-launch-1.0 filesrc location= ! \ qtdemux ! h264parse ! nvv4l2decoder ! nvegltransform ! nveglglessink -e GStreamer Build Instructions @@@@@@@@@@@@@@@@@@@@@@@@@@@@ Use the ``gst-install`` script to install a specific GStreamer version. This section provides a procedure for building current versions of GStreamer. To build GStreamer using gst-install #################################### 1. Run the command:: $ gst-install [--prefix=] [--version=] Where: - ```` is the location where GStreamer is to be installed. - ```` is the GStreamer version to be installed. 2. Run the commands:: $ export LD_LIBRARY_PATH=/lib/aarch64-linux-gnu $ export PATH=/bin:$PATH Where ```` is the location GStreamer has been installed. For example:: $ gst-install --prefix=/home/ubuntu/gst-1.16.2 --version=1.16.2 $ export LD_LIBRARY_PATH=/home/ubuntu/gst-1.16.2/lib/aarch64-linux-gnu % export PATH=/home/ubuntu/gst-1.16.2/bin:$PATH To build GStreamer manually ########################### 1. Download the latest version of GStreamer, available from the `freedesktop.org GStreamer source directory `__. You need the following files from version 1.16.2: - ``gstreamer-1.16.2.tar.xz`` - ``gst-plugins-base-1.16.2.tar.xz`` - ``gst-plugins-good-1.16.2.tar.xz`` - ``gst-plugins-bad-1.16.2.tar.xz`` - ``gst-plugins-ugly-1.16.2.tar.xz`` .. todo:: First, up to now the whole topic has concerned GStreamer 1.0 and 1.14; now we're giving instructions for building GStreamer version 1.16.2, which (although we do not explicitly say so) appears to be the current version. There's no explanation of the differences between 1.1/1.14 and 1.16.2. Nor are there instructions for making GStreamer applications work with v1.16.2 or cautions about things you can't do or must do when using it. It seems logical that if the reader is going to bother building GStreamer themselves, they should get something they didn't already have, and the current release is a logical thing for them to get. But it also seems logical that if there are no compatibility issues worth mentioning, we might as well give them a package containing v1.16.2 and save them the trouble. Second, according to the `GStreamer Releases page `__, the latest stable release is 1.18.4, not 1.16.2. Third, the source files at the URL given above are not packed in archives. The files the reader is told they need are not there at all, unless they're buried in some unidentified subdirectory. 2. To install required packages, enter the command:: $ sudo apt-get install build-essential dpkg-dev flex bison \ autotools-dev automake liborc-dev autopoint libtool \ gtk-doc-tools libgstreamer1.0-dev 3. In the home (``~``) directory, create a subdirectory named ``gst_``, where ```` is the version number of GStreamer you are building. 4. 
Copy the downloaded ``.tar.xz`` files to the ``gst_`` directory.

5. Uncompress the ``.tar.xz`` files in the ``gst_`` directory.

6. Set the environment variable ``PKG_CONFIG_PATH`` by entering the command::

      $ export PKG_CONFIG_PATH=/home/ubuntu/gst_1.16.2/out/lib/pkgconfig

7. Build GStreamer (in this example, ``gstreamer-1.16.2``) by entering the
   commands::

      $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
      $ make
      $ make install

8. Build ``gst-plugins-base-1.16.2`` by entering the commands::

      $ sudo apt-get install libxv-dev libasound2-dev libtheora-dev \
        libogg-dev libvorbis-dev
      $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
      $ make
      $ make install

9. Build ``gst-plugins-good-1.16.2`` by entering the commands::

      $ sudo apt-get install libbz2-dev libv4l-dev libvpx-dev \
        libjack-jackd2-dev libsoup2.4-dev libpulse-dev
      $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
      $ make
      $ make install

10. Build ``gst-plugins-bad-1.16.2`` by entering the commands::

       $ sudo apt-get install faad libfaad-dev libfaac-dev
       $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
       $ make
       $ make install

11. Build ``gst-plugins-ugly-1.16.2`` by entering the commands::

       $ sudo apt-get install libx264-dev libmad0-dev
       $ ./configure --prefix=/home/ubuntu/gst_1.16.2/out
       $ make
       $ make install

12. Set the environment variable ``LD_LIBRARY_PATH`` by entering the command::

       $ export LD_LIBRARY_PATH=/home/ubuntu/gst_1.16.2/out/lib/

13. Copy the NVIDIA ``gstreamer-1.0`` libraries to the ``gst_1.16.2`` plugin
    directory by entering the commands::

       $ cd /usr/lib/aarch64-linux-gnu/gstreamer-1.0/
       $ cp libgstnv\* ~/gst_1.16.2/out/lib/gstreamer-1.0/

.. todo::
   The term "NVIDIA gstreamer-1.0" is puzzling. In the source document it's
   ``nvidia gstreamer-1.0``, which makes it look like a filename with a space
   in it. Assuming it's not that, is it "NVIDIA ``gstreamer-1.0``," where
   NVIDIA identifies the provider and ``gstreamer-1.0`` is a filename or
   directory name? Or is it "NVIDIA GStreamer 1.0," where GStreamer is the
   name of the framework and 1.0 is the release number?

The NVIDIA ``gstreamer-1.0`` libraries include:

- ``libgstnvarguscamera.so``
- ``libgstnvv4l2camerasrc.so``
- ``libgstnvcompositor.so``
- ``libgstnvdrmvideosink.so``
- ``libgstnveglglessink.so``
- ``libgstnveglstreamsrc.so``
- ``libgstnvegltransform.so``
- ``libgstnvivafilter.so``
- ``libgstnvjpeg.so``
- ``libgstnvtee.so``
- ``libgstnvvidconv.so``
- ``libgstnvvideo4linux2.so``
- ``libgstnvvideocuda.so``
- ``libgstnvvideosink.so``
- ``libgstnvvideosinks.so``

.. _SD.Multimedia.AcceleratedGstreamer-Nvgstcapture10Reference:

nvgstcapture-1.0 Reference
@@@@@@@@@@@@@@@@@@@@@@@@@@

This section describes the ``nvgstcapture-1.0`` application.

.. note::

   By default, ``nvgstcapture-1.0`` only supports the ARGUS API using the
   ``nvarguscamerasrc`` plugin. The legacy ``nvcamerasrc`` plugin is no
   longer supported.

Command Line Options
####################

To display command usage information, run ``nvgstcapture-1.0`` with one of
these command line options:

- ``-h`` or ``--help``: Shows command line options except for GStreamer
  options.
- ``--help-all``: Shows all command line options.
- ``--help-gst``: Shows GStreamer command line options.

The following table describes the application’s other command-line options:

.. raw:: html
   :file: AcceleratedGstreamer/Nvgstcapture10CommandLineOptions.htm

CSI Camera Supported Resolutions
################################

CSI camera supports the following image resolutions for Nvarguscamera:

..
todo:: This is ungrammatical, and it's not clear what it means. Some possibilities: "A CSI camera," "The CSI camera," "The CSI camera driver." - 640\ |times|\ 480 - 1280\ |times|\ 720 - 1920\ |times|\ 1080 - 2104\ |times|\ 1560 - 2592\ |times|\ 1944 - 2616\ |times|\ 1472 - 3840\ |times|\ 2160 - 3896\ |times|\ 2192 - 4208\ |times|\ 3120 - 5632\ |times|\ 3168 - 5632\ |times|\ 4224 CSI Camera Runtime Commands ########################### Options for Nvarguscamera $$$$$$$$$$$$$$$$$$$$$$$$$ .. todo:: It's not clear what this subheading means, since the heading seems to describe the contents accurately (it's a table of commands, not options) and there are no others. The following text says "commands options," which appears to mean "commands." As a related issue, our style guidelines do not allow a subheading to follow a heading without intervening text. If both headings are needed, we need text between them. As a separate issue, is the name really "Nvarguscamera," with an initial cap? This is very unusual for a program name. Maybe it's the proper name (distinguished from the filename) of a library or API, but the fact that it starts with the letters 'Nv', and the fact that it's not a pronounceable word, argue against that. As another separate issue, are we talking about "Nvarguscamera," or "CSI camera"? Maybe the two terms are equivalent, but we should use one consistently. The following table describes CSI camera runtime command line options for ``Nvarguscamera``. +--------------------------------------------------------------------------+ | Nvarguscamera command line options | +-------------+--------------------------+---------------------------------+ | Command | Description | Value and examples | +=============+==========================+=================================+ | h | Help. | |mdash| | +-------------+--------------------------+---------------------------------+ | q | Quit. | |mdash| | +-------------+--------------------------+---------------------------------+ | mo: | Set capture mode. | 1: image | | | | | | | | 2: video | +-------------+--------------------------+---------------------------------+ | gmo | Get capture mode. | |mdash| | +-------------+--------------------------+---------------------------------+ | so: | Set sensor orientation. | 0: none | | | | | | | | 1: rotate counter-clockwise | | | | 90\ |deg| | | | | | | | | 2: rotate 180\ |deg| | | | | | | | | 3: rotate clockwise 90\ |deg| | +-------------+--------------------------+---------------------------------+ | gso | Get sensor orientation. | |mdash| | +-------------+--------------------------+---------------------------------+ | wb: | Set white balance mode. | 0: off | | | | | | | | 1: auto | | | | | | | | 2: incandescent | | | | | | | | 3: fluorescent | | | | | | | | 4: warm-fluorescent | | | | | | | | 5: daylight | | | | | | | | 6: cloudy-daylight | | | | | | | | 7: twilight | | | | | | | | 8: shade | | | | | | | | 9: manual | +-------------+--------------------------+---------------------------------+ | gwb | Get white balance mode. | |mdash| | +-------------+--------------------------+---------------------------------+ | st: | Set saturation. | 0-2 | | | | | | | | Example: ``st:1.25`` | +-------------+--------------------------+---------------------------------+ | gst | Get saturation. | |mdash| | +-------------+--------------------------+---------------------------------+ | j | Capture one image. 
| |mdash| | +-------------+--------------------------+---------------------------------+ | jx | Capture after a delay of | |mdash| | | | ```` seconds. | | | | | Example: ``jx5000`` for a | | | | 5\ |nbsp|\ second delay. | +-------------+--------------------------+---------------------------------+ | j: | Capture ```` | |mdash| | | | images in succession. | | | | | Example: ``j:6`` to capture | | | | 6\ |nbsp|\ images. | +-------------+--------------------------+---------------------------------+ | 0 | Stop recording video. | |mdash| | +-------------+--------------------------+---------------------------------+ | 1 | Start recording video. | |mdash| | +-------------+--------------------------+---------------------------------+ | 2 | Video snapshot (while | |mdash| | | | recording video). | | +-------------+--------------------------+---------------------------------+ | gpcr | Get preview resolution. | |mdash| | +-------------+--------------------------+---------------------------------+ | gicr | Get image capture | |mdash| | | | resolution. | | +-------------+--------------------------+---------------------------------+ | gvcr | Get video capture | |mdash| | | | resolution. | | +-------------+--------------------------+---------------------------------+ USB Camera Runtime Commands ########################### The following table describes USB camera runtime commands. +------------------------------------------------------------------------------+ | USB camera runtime commands | +-------------+-----------------------------------+----------------------------+ | Command | Description | Value and examples | +=============+===================================+============================+ | h | Help. | |mdash| | +-------------+-----------------------------------+----------------------------+ | q | Quit. | |mdash| | +-------------+-----------------------------------+----------------------------+ | mo: | Set capture mode. | 1: image | | | | | | | | 2: video | +-------------+-----------------------------------+----------------------------+ | gmo | Get capture mode. | |mdash| | +-------------+-----------------------------------+----------------------------+ | j | Capture one image. | |mdash| | +-------------+-----------------------------------+----------------------------+ | jx | Capture after a delay of | |mdash| | | | ```` milliseconds. | | | | | Example: ``jx5000`` to | | | | capture after a | | | | 5000\ |nbsp|\ millisecond | | | | (5\ |nbsp|\ second) delay. | +-------------+-----------------------------------+----------------------------+ | j: | Capture ```` images in | |mdash| | | | succession. | | | | | Example: ``j:6`` to capture| | | | 6\ |nbsp|\ images. | +-------------+-----------------------------------+----------------------------+ | 1 | Start recording video. | |mdash| | +-------------+-----------------------------------+----------------------------+ | 0 | Stop recording video. | |mdash| | +-------------+-----------------------------------+----------------------------+ | pcr: | Set preview resolution. | 0: 176\ |times|\ 144 | | | | | | | | 1: 320\ |times|\ 240 | | | | | | | | 2: 640\ |times|\ 480 | | | | | | | | 3: 1280\ |times|\ 720 | +-------------+-----------------------------------+----------------------------+ | gpcr | Get preview resolution. | |mdash| | +-------------+-----------------------------------+----------------------------+ | gicr | Get image capture resolution. 
| |mdash| | +-------------+-----------------------------------+----------------------------+ | gvcr | Get video capture resolution. | |mdash| | +-------------+-----------------------------------+----------------------------+ | br: | Set encoding bit rate in bytes. | Example: ``br:4000000`` | +-------------+-----------------------------------+----------------------------+ | gbr | Get encoding bit rate. | |mdash| | +-------------+-----------------------------------+----------------------------+ | cdn: | Set capture device node. | 0: ``//dev/video0`` | | | | | | | | 1: ``//dev/video1`` | | | | | | | | 2: ``//dev/video2`` | +-------------+-----------------------------------+----------------------------+ | gcdn | Get capture device node. | |mdash| | +-------------+-----------------------------------+----------------------------+ Runtime Video Encoder Configuration Options ########################################### The following table describes runtime video encoder configuration options supported for ``Nvarguscamera``. +------------------------------------------------------------------------+ | Runtime video encoder options | +-------------+--------------------------------+-------------------------+ | Command | Description | Value and examples | +=============+================================+=========================+ | br: | Sets encoding bit-rate in | Example: ``br:4000000`` | | | bytes. | | +-------------+--------------------------------+-------------------------+ | gbr | Gets encoding bit-rate in | |mdash| | | | bytes. | | +-------------+--------------------------------+-------------------------+ | ep: | Sets encoding profile (for | 0: baseline | | | H.264 only). | | | | | 1: main | | | | | | | | 2: high | | | | | | | | Example: ``ep:1`` | +-------------+--------------------------------+-------------------------+ | gep | Gets encoding profile (for | |mdash| | | | H.264 only). | | +-------------+--------------------------------+-------------------------+ | Enter ‘f’ | Forces IDR frame on video | |mdash| | | | encoder (for H.264 only). | | +-------------+--------------------------------+-------------------------+ .. todo:: In the last entry, what does "Enter 'f'" mean, as opposed to just "f"? Presumably configuration options go in a file (what file?), so "enter" would be a direction concerning the text editor, not the option, which makes no sense. Note, the same issue occurs in other places. Notes ##### - ``nvgstcapture-1.0`` generates image and video output files in the same directory as the application itself. - Filenames are respectively in these formats: - Image content: ``nvcamtest___.jpg`` - Video content: ``nvcamtest___.mp4`` Where: - ```` is the process ID. - ```` is the sensor ID. - ```` is a counter starting from 0 each time the application is run. - Rename or move files between runs to avoid overwriting results you want to save. - The application supports native capture mode (video only) by default. - Advanced features, such as setting zoom, brightness, exposure, and whitebalance levels, are not supported for USB cameras. nvgstplayer-1.0 Reference @@@@@@@@@@@@@@@@@@@@@@@@@ This section describes the operation of the the ``nvgstplayer-1.0`` application. nvgstplayer-1.0 Command Line Options #################################### .. note:: To list supported options, enter the command:: $ nvgstplayer-1.0 --help This table describes ``nvgstplayer-1.0`` command line options. .. 
raw:: html
   :file: AcceleratedGstreamer/Nvgstplayer10CommandLineOptions.htm

nvgstplayer-1.0 Runtime Commands
################################

This table describes ``nvgstplayer-1.0`` runtime commands.

.. raw:: html
   :file: AcceleratedGstreamer/Nvgstplayer10RuntimeCommands.htm

Video Encoder Features
@@@@@@@@@@@@@@@@@@@@@@

The GStreamer-1.0-based ``gst-v4l2`` video encoders support the following
features:

.. raw:: html
   :file: AcceleratedGstreamer/VideoEncoderFeatures~GstV4l2.htm

Supported Cameras
@@@@@@@@@@@@@@@@@

This section describes the supported cameras.

CSI Cameras
###########

- Jetson AGX Xavier series can capture camera images via the CSI interface.

  .. todo::
     Presumably also Xavier NX? Orin?

- Jetson AGX Xavier series supports both YUV and RAW Bayer capture data.

  .. todo::
     Presumably also Xavier NX? Orin?

- GStreamer supports simultaneous capture from multiple CSI cameras. Support
  is validated using the ``nvgstcapture`` application.

- Capture is validated for SDR, PWL HDR, and DOL HDR modes for various
  sensors using the ``nvgstcapture`` application.

- Jetson AGX Xavier series also supports the MIPI CSI virtual channel
  feature. The virtual channel is a unique channel identifier used for
  multiplexed sensor streams that share the same CSI port/brick and CSI
  stream through supported GMSL (Gigabit Multimedia Serial Link) aggregators.

- GMSL + VC capture is validated on Jetson AGX Xavier series using the
  ``nvgstcapture`` application. The reference GMSL module
  (MAX9295-serializer/MAX9296-deserializer/IMX390-sensor) is used for
  validation purposes.

USB 2.0 Cameras
###############

The following camera has been validated on Jetson platforms running Jetson
Linux with USB 2.0 ports. This camera is UVC compliant.

- `Logitech C920 `__

Industrial Camera Details
#########################

The following USB 3.0 industrial camera is validated on Jetson AGX Xavier
series under Jetson Linux:

- `See3CAM_CU130 `__

Characteristics of this camera are:

- USB 3.0
- UVC compliant
- 3840\ |times|\ 2160 at 30 FPS; 4224\ |times|\ 3156 at 13 FPS
- Purpose\ |mdash|\ embedded navigation
- Tested using the ``nvgstcapture`` application.
- Issues encountered: the frame rate cannot be locked; it varies with
  exposure and cannot be set explicitly. Adding this support to the camera
  firmware requires a payment to the vendor.
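For a quick functional check of a UVC-compliant USB camera outside of
``nvgstcapture-1.0``, a plain ``v4l2src`` capture pipeline can be used. This
is a minimal sketch that assumes the camera enumerates as ``/dev/video0`` and
offers a 640\ |times|\ 480 raw mode; adjust the device node, resolution, and
frame rate to match the camera::

    $ gst-launch-1.0 v4l2src device=/dev/video0 ! \
      'video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \
      videoconvert ! xvimagesink -e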