Jetson Linux API Reference

35.2.1 Release
Building and Running

You can run the samples on Jetson without rebuilding them. However, if you modify those samples, you must rebuild them before running them.

For information on building the samples on a host Linux PC (x86), see Setting Up Cross-Platform Support.

Build and run the samples by following the procedures in this document:

  1. Export environment variables.
  2. Use JetPack to install these programs:
    • NVIDIA® CUDA®
    • OpenCV
    • cuDNN
    • NVIDIA® TensorRT™, previously known as GIE
  3. Create symbolic links.
  4. Optionally, set up cross-compiler support.
  5. Build and run the samples.

Step 1: Export environment variables

  • Export the X display with the following command:
      $ export DISPLAY=:0
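
    As a quick sanity check, you can confirm in the same shell session that the variable took effect:

    ```shell
    # Set the display for the current shell, then confirm it is set.
    export DISPLAY=:0
    echo "DISPLAY is set to $DISPLAY"   # prints "DISPLAY is set to :0"
    ```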
    

Step 2: Use Jetpack to install CUDA/OpenCV/cuDNN/TensorRT

If you have already installed these libraries, you can skip the following steps.

  1. Download Jetpack from the following website:
     https://developer.nvidia.com/embedded/downloads
    
  2. Run the installation script from the host machine with the following commands:
     $ chmod +x ./JetPack-L4T-<version>-linux-x64.run
     $ ./JetPack-L4T-<version>-linux-x64.run
    
  3. Select Development Environment.
  4. Select "custom" and click "clear action".
  5. Select "CUDA Toolkit", "OpenCV", "cuDNN Package", and "TensorRT", and then install.
  6. For installation details, see the _installer folder.
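
After installation, you can check that the components are present on the target. This is a hedged sketch that assumes a Debian-based L4T root filesystem where JetPack installs these components as .deb packages; the exact package names vary by release:

```shell
# List installed packages for each component (names vary by JetPack release).
dpkg -l | grep -iE 'cuda-toolkit|libopencv|libcudnn|tensorrt'
# CUDA also reports its version directly:
/usr/local/cuda/bin/nvcc --version
```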

Step 3: Create symbolic links

  • Create symbolic links with the following commands:
     $ cd /usr/lib/aarch64-linux-gnu
     $ sudo ln -sf libv4l2.so.0 libv4l2.so
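
    If you want to see what the `ln -sf` pattern produces before touching system directories, you can reproduce it in a scratch directory (purely illustrative; the file names mirror the real ones above):

    ```shell
    # Illustrative only: reproduce the symlink pattern in a scratch directory.
    tmp=$(mktemp -d)
    cd "$tmp"
    touch libv4l2.so.0           # stand-in for the real versioned library
    ln -sf libv4l2.so.0 libv4l2.so
    ls -l libv4l2.so             # shows: libv4l2.so -> libv4l2.so.0
    ```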
    

Step 4: Set up cross-compiler support (Optional)

  • To build the samples on a host Linux PC (x86), see Setting Up Cross-Platform Support.

Step 5: Build and run the samples

  • Build and run, as described for each sample.

| Directory Location Relative to ll_samples/samples | Description |
|---------------------------------------------------|-------------|
| 00_video_decode (video decode) | Decodes H.264, H.265, VP8, VP9, MPEG4, and MPEG2 video from a local file and then shares the YUV buffer with the EGL renderer. |
| 01_video_encode (video encode) | Encodes a YUV bitstream from a local file and then writes elementary H.264/H.265 into a file. |
| l4t_mm_02_video_dec_cuda | Decodes H.264/H.265 video from a local file and then shares the YUV buffer with CUDA to draw a black box in the left corner. |
| l4t_mm_03_video_cuda_enc | Uses CUDA to draw a black box in the YUV buffer and then feeds it to the video encoder to generate an H.264/H.265 video file. |
| l4t_mm_04_video_dec_trt | Uses simple TensorRT calls to save the bounding box info to a file. |
| l4t_mm_05_jpeg_encode | Uses libjpeg-8b APIs to encode JPEG images from software-allocated buffers. |
| l4t_mm_06_jpeg_decode | Uses libjpeg-8b APIs to decode a JPEG image from software-allocated buffers. |
| l4t_mm_07_video_convert | Uses V4L2 APIs to do video format conversion and video scaling. |
| 08_video_dec_drm (Direct Rendering Manager) | Uses the NVIDIA® Tegra® Direct Rendering Manager (DRM) to render a video stream or UI. |
| l4t_mm_09_argus_camera_jpeg | Simultaneously uses the Libargus API to preview the camera stream and libjpeg-8b APIs to encode JPEG images. |
| l4t_mm_10_argus_camera_recording | Gets the real-time camera stream from the Libargus API and feeds it into the video encoder to generate H.264/H.265 video files. |
| l4t_mm_12_v4l2_camera_cuda | Captures images from a V4L2 camera and shares the stream with CUDA engines to draw a black box in the upper left corner. |
| l4t_mm_13_argus_multi_camera | Captures from multiple cameras and composites them into one frame. |
| 14_multivideo_decode (multi video decode) | Decodes multiple H.264, H.265, VP8, VP9, MPEG4, and MPEG2 videos from local files and writes the YUV buffers into corresponding files. |
| 15_multivideo_encode (multi video encode) | Encodes multiple YUV bitstreams from local files and writes elementary H.264/H.265/VP8/VP9 into corresponding files. |
| 16_multivideo_transcode (multi video transcode) | Transcodes multiple bitstreams from local files and writes elementary H.264/H.265/VP8/VP9 into corresponding files. |
| unittest_samples/camera_unit_sample (capture with libv4l2_nvargus) | Unit-level sample; uses libv4l2_nvargus to preview the camera stream. |
| unittest_samples/decoder_unit_sample (video decode unit sample) | Unit-level sample; decodes H.264 video from a local file and dumps the raw YUV buffer. |
| unittest_samples/encoder_unit_sample (video encode unit sample) | Unit-level sample; encodes a YUV bitstream from a local file and writes the elementary H.264 bitstream into a file. |
| unittest_samples/transform_unit_sample (nvbuf_utils pixel format conversion) | Unit-level sample; uses the nvbuf_utils utility to convert a YUV bitstream from one colorspace to another. |
| backend (video analytics) | Performs intelligent video analytics on four concurrent video streams: decoding with the on-chip decoders, video scaling with the on-chip scaler, and GPU compute. |
| l4t_mm_17_frontend | Performs independent processing on four different resolutions of video capture coming directly from the camera. |
| l4t_mm_18_v4l2_camera_cuda_rgb | Uses V4L2 image capture with CUDA format conversion. |
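
As a concrete sketch of step 5, the first sample is typically built in place on the Jetson and then run against an H.264 clip. The input file name below is an assumption; the test clips bundled with the samples vary by release, so check the sample's README for the exact path:

```shell
# Build the decode sample in-place on the Jetson.
cd ll_samples/samples/00_video_decode
make
# Hypothetical input path; substitute a real H.264 elementary stream.
./video_decode H264 ../../data/Video/sample_outdoor_car_1080p_10fps.h264
```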


For details on each sample's structure and the APIs they use, see Sample Applications in this reference.