Compiling DALI from Source

Bare Metal build


For the recommended dependency versions, please check the notes below.

Required components:

  • Linux x64

  • GCC

  • clang and python-clang bindings - needed for compile-time code generation. The easiest way to obtain them is 'pip install clang libclang'.
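A quick way to confirm those bindings are in place before configuring the build (a sketch, not part of the official build steps; the package names are the ones given above):

```shell
# Check that the python-clang bindings installed via
# `pip install clang libclang` can actually be imported.
python3 - <<'EOF'
try:
    import clang.cindex
    print("python-clang bindings: OK")
except ImportError:
    print("python-clang bindings: MISSING")
EOF
```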

  • Boost - modules: preprocessor.

  • nvJPEG library - can be unofficially disabled; see below.

  • protobuf - supported version: 3.11.1.

  • CMake

  • libjpeg-turbo - can be unofficially disabled; see below.

  • libtiff - can be unofficially disabled; see below. Note: libtiff should be built with zlib support.

  • FFmpeg - we recommend using version 4.2.2, compiled following the instructions below.

  • libsnd - we recommend using version 1.0.28, compiled following the instructions below.

  • OpenCV - supported version: 4.3.0.

  • (Optional) liblmdb

  • (Optional) GPU Direct Storage - only libcufile is required for the build process, and the installed header needs to land in the /usr/local/cuda/include directory.
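Since the build only needs the libcufile header, a simple pre-flight check can save a failed configure run (a sketch assuming the default CUDA location mentioned above):

```shell
# Verify that the GPU Direct Storage header is where DALI's build expects it.
CUDA_INC=/usr/local/cuda/include
if [ -f "$CUDA_INC/cufile.h" ]; then
    echo "cufile.h found in $CUDA_INC"
else
    echo "cufile.h not found in $CUDA_INC; GDS support will not build"
fi
```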

  • One or more Deep Learning frameworks: a TensorFlow installation is required to build the TensorFlow plugin for DALI.


Items marked “unofficial” are community contributions that are believed to work but not officially tested or maintained by NVIDIA.


This software uses FFmpeg code licensed under the LGPLv2.1. Its source can be downloaded from here.

FFmpeg was compiled using the following command line:

./configure \
--prefix=/usr/local \
--disable-static \
--disable-programs \
--disable-doc \
--disable-avdevice \
--disable-swresample \
--disable-swscale \
--disable-postproc \
--disable-w32threads \
--disable-os2threads \
--disable-dct \
--disable-dwt \
--disable-error-resilience \
--disable-lsp \
--disable-lzo \
--disable-mdct \
--disable-rdft \
--disable-fft \
--disable-faan \
--disable-pixelutils \
--disable-autodetect \
--disable-iconv \
--enable-shared \
--enable-avformat \
--enable-avcodec \
--enable-avfilter \
--disable-encoders \
--disable-hwaccels \
--disable-muxers \
--disable-protocols \
--enable-protocol=file \
--disable-indevs \
--disable-outdevs  \
--disable-devices \
--disable-filters \
--disable-bsfs \
--enable-bsf=h264_mp4toannexb,hevc_mp4toannexb,mpeg4_unpack_bframes && \
make
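After `make` and `make install`, a quick sanity check (a sketch; it assumes the /usr/local prefix from the configure line above) is to look for the shared libraries the build enables:

```shell
# avformat, avcodec and avfilter are enabled above with --enable-shared,
# so their .so files should appear under the configured prefix.
PREFIX=/usr/local
for lib in libavformat libavcodec libavfilter; do
    if ls "$PREFIX"/lib/"$lib".so* >/dev/null 2>&1; then
        echo "$lib: installed"
    else
        echo "$lib: not found under $PREFIX/lib"
    fi
done
```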


This software uses libsnd, licensed under the LGPLv2.1. Its source can be downloaded from here.

libsnd was compiled using the following command line:

./configure && make

Build DALI

  1. Get the DALI source code:

git clone --recursive https://github.com/NVIDIA/DALI
  2. Create a directory for CMake-generated Makefiles. This is the directory that DALI will be built in.

mkdir build
cd build
  3. Run CMake. For additional options you can pass to CMake, refer to Optional CMake Build Parameters.

cmake -D CMAKE_BUILD_TYPE=Release ..
  4. Build. You can use the -j option to run the build on several threads.

make -j"$(nproc)"
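The build step can be wrapped to make the thread count overridable, which helps on memory-constrained machines (MAKE_JOBS is a hypothetical variable for this sketch, not a DALI convention):

```shell
# Use every core by default, but let the caller cap parallelism.
JOBS="${MAKE_JOBS:-$(nproc)}"
echo "building with $JOBS parallel jobs"
# make -j"$JOBS"   # uncomment inside a configured DALI build directory
```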

Install Python Bindings

To run DALI using the Python API, you need to install the Python bindings:

cd build
pip install dali/python


Although you can create a wheel here by calling pip wheel dali/python, we do not recommend it. Such a wheel is not self-contained (it does not bundle all the dependencies) and will work only on the system where you built DALI bare metal. To build a wheel that contains the dependencies, and can therefore be used on other systems, follow Using Docker builder - recommended.
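Once installed, a quick, non-destructive way to confirm the bindings import correctly is a sketch like the following (it only reports, never fails):

```shell
# Try importing the installed DALI Python bindings and print the version.
python3 - <<'EOF'
try:
    import nvidia.dali as dali
    print("DALI imported, version:", dali.__version__)
except ImportError:
    print("nvidia.dali is not installed in this environment")
EOF
```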

Verify the Build (Optional)

Obtain Test Data

You can verify the build by running GTest and Nose tests. To do so, you need the DALI_extra repository, which contains the test data. To download it, follow the DALI_extra README. Keep in mind that you need git-lfs to properly clone the DALI_extra repo. To install git-lfs, follow this tutorial.

Set Test Data Path

DALI uses the DALI_EXTRA_PATH environment variable to locate the test data. You can set it by invoking:

$ export DALI_EXTRA_PATH=<path_to_DALI_extra>
e.g. export DALI_EXTRA_PATH=/home/yourname/workspace/DALI_extra
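Since a wrong path only surfaces later as test failures, it can help to validate the variable right after setting it (a sketch; the fallback path below is a placeholder):

```shell
# Fall back to a placeholder location when DALI_EXTRA_PATH is unset,
# then warn early if the directory is missing.
export DALI_EXTRA_PATH="${DALI_EXTRA_PATH:-$HOME/workspace/DALI_extra}"
if [ -d "$DALI_EXTRA_PATH" ]; then
    echo "test data found at $DALI_EXTRA_PATH"
else
    echo "warning: $DALI_EXTRA_PATH does not exist; clone DALI_extra first"
fi
```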

Run Tests

DALI tests consist of two parts: C++ (GTest) and Python (usually Nose, though not exclusively). To run them, there are convenient Make targets that you can invoke after the build has finished:

cd <path_to_DALI>/build
make check-gtest check-python

Building DALI with Clang (Experimental)


This build is experimental. It is neither maintained nor tested. It is not guaranteed to work. We recommend using GCC for production builds.

cmake -DCMAKE_CXX_COMPILER=clang++ -DCMAKE_C_COMPILER=clang ..
make -j"$(nproc)"

Optional CMake Build Parameters

  • BUILD_PYTHON - build Python bindings (default: ON)

  • BUILD_TEST - include building test suite (default: ON)

  • BUILD_BENCHMARK - include building benchmarks (default: ON)

  • BUILD_LMDB - build with support for LMDB (default: OFF)

  • BUILD_NVTX - build with NVTX profiling enabled (default: OFF)

  • BUILD_NVJPEG - build with nvJPEG support (default: ON)

  • BUILD_NVJPEG2K - build with nvJPEG2k support (default: OFF)

  • BUILD_LIBTIFF - build with libtiff support (default: ON)

  • BUILD_FFTS - build with ffts support (default: ON)

  • BUILD_LIBSND - build with libsnd support (default: ON)

  • BUILD_NVOF - build with NVIDIA OPTICAL FLOW SDK support (default: ON)

  • BUILD_NVDEC - build with NVIDIA NVDEC support (default: ON)

  • BUILD_NVML - build with NVIDIA Management Library (NVML) support (default: ON)

  • BUILD_CUFILE - build with GPU Direct Storage support (default: ON)

  • VERBOSE_LOGS - enables verbose logging in DALI (default: OFF)

  • WERROR - treat all build warnings as errors (default: OFF)

  • BUILD_WITH_ASAN - build with ASAN support (default: OFF). To run, issue:

LD_LIBRARY_PATH=. ASAN_OPTIONS=symbolize=1:protect_shadow_gap=0 ASAN_SYMBOLIZER_PATH=$(shell which llvm-symbolizer)

Where *X* in libasan.X depends on the compiler version used; for example, GCC 7.x uses 4. Tested with GCC 7.4, CUDA 10.0,
and libasan.4. Any earlier version may not work.

  • BUILD_DALI_NODEPS - disables support for third-party libraries that are normally expected to be available in the system

Enabling this option effectively results in only the most basic parts of DALI being compiled (the C++ core and kernels libraries). It is useful when you want to use DALI processing primitives (kernels) directly, without DALI's executor infrastructure.

  • LINK_DRIVER - enables direct linking with the driver libraries (or an appropriate stub) instead of dlopen-ing them at runtime (removes the requirement to have clang-python bindings available to generate the stubs)

  • DALI_BUILD_FLAVOR - allows specifying a custom name suffix (e.g. 'nightly') for the nvidia-dali whl package

  • (Unofficial) BUILD_JPEG_TURBO - build with libjpeg-turbo (default: ON)

  • (Unofficial) BUILD_LIBTIFF - build with libtiff (default: ON)
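As an illustration, several of the options above can be combined in a single configure invocation; the flags chosen here are arbitrary examples, not recommendations:

```
cmake -D CMAKE_BUILD_TYPE=Release \
      -D BUILD_LMDB=ON \
      -D BUILD_NVTX=OFF \
      -D WERROR=ON \
      ..
```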


DALI release packages are built with the options listed above set to ON and NVTX turned OFF. Testing is done with the same configuration. We ensure that DALI compiles with all of those options turned OFF, but there may exist cross-dependencies between some of those features.

The following CMake parameters can be helpful in setting the right paths:

  • FFMPEG_ROOT_DIR - path to installed FFmpeg

  • NVJPEG_ROOT_DIR - where nvJPEG can be found (from CUDA 10.0 it is shipped with the CUDA toolkit so this option is not needed there)

  • libjpeg-turbo options can be obtained from the libjpeg-turbo CMake docs page

  • protobuf options can be obtained from the protobuf CMake docs page

Cross-compiling for aarch64 Jetson Linux (Docker)


Support for the aarch64 Jetson Linux platform is experimental. Some features are available only for the x86-64 target and are turned off in this build.


Download the JetPack 4.4 SDK for NVIDIA Jetson using the SDK Manager, following the instructions provided here. Then select CUDA for the host. After the download process has completed, move cuda-repo-ubuntu1804-10-2-local-10.2.89-440.40_1.0-1_amd64.deb and cuda-repo-cross-aarch64-10-2-local-10.2.89_1.0-1_all.deb from the download folder to the main DALI folder (they are required for the cross build).
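Before building the container, it can help to verify that both packages are actually in place (a sketch to be run from the DALI source root):

```shell
# Check for the two JetPack .deb files the cross build requires.
for f in cuda-repo-ubuntu1804-10-2-local-10.2.89-440.40_1.0-1_amd64.deb \
         cuda-repo-cross-aarch64-10-2-local-10.2.89_1.0-1_all.deb; do
    if [ -f "$f" ]; then
        echo "found: $f"
    else
        echo "missing: $f"
    fi
done
```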

Build the aarch64 Jetson Linux Build Container

docker build -t nvidia/dali:tools_aarch64-linux -f docker/Dockerfile.cuda_aarch64.deps .
docker build -t nvidia/dali:builder_aarch64-linux --build-arg "AARCH64_CUDA_TOOL_IMAGE_NAME=nvidia/dali:tools_aarch64-linux" -f docker/ .


From the root of the DALI source tree, run:

docker run -v $(pwd):/dali nvidia/dali:builder_aarch64-linux

The relevant Python wheel will be placed in dali_root_dir/wheelhouse.