NVIDIA Holoscan SDK v0.6

SDK Installation

This section covers the installation of the Holoscan SDK development stack, designed for NVIDIA Developer Kits (based on HoloPack or JetPack) and for x86_64 Linux compute platforms, and ideal for developing and testing the SDK.

Note

For Holoscan Developer Kits such as the IGX Orin Developer Kit, an alternative option is the deployment stack, based on OpenEmbedded (Yocto build system) instead of Ubuntu. This is recommended to limit your stack to the software components strictly required to run your Holoscan application. The runtime Board Support Package (BSP) can be optimized with respect to memory usage, speed, security and power requirements.

Set up your developer kit:

| Developer Kit | User Guide | L4T | GPU Mode |
|---|---|---|---|
| NVIDIA IGX Orin | Guide | HoloPack 2.0 | iGPU or dGPU* |
| NVIDIA Clara AGX | Guide | HoloPack 1.2 | dGPU* |
| NVIDIA Jetson AGX Orin | Guide | JetPack 5.1.1 | iGPU |

* iGPU and dGPU can be used concurrently on a single developer kit in dGPU mode with the L4T Compute Assist container

Note

For ConnectX support on the IGX Orin and Clara AGX developer kits, install the MOFED drivers (>=5.8). They can also be installed by selecting the Rivermax option in SDK Manager, available if you are part of the Rivermax SDK program.

If the developer kit is in dGPU mode, reconfigure its GPUDirect driver after installing MOFED to enable RDMA (link).

You’ll need the following to use the Holoscan SDK on x86_64:

Additional software dependencies might be needed based on how you choose to install the SDK (see section below).

Refer to the Additional Setup and Third-Party Hardware Setup sections for additional prerequisites.

We provide multiple ways to install and run the Holoscan SDK:

| | NGC Container | Debian Package | Python Wheels |
|---|---|---|---|
| Runtime libraries | Included | Included | Included |
| Python module | 3.8 | 3.8 | 3.8 to 3.11 |
| C++ headers and CMake config | Included | Included | N/A |
| Examples (+ source) | Included | Included | retrieve from GitHub |
| Sample datasets | Included | retrieve from NGC | retrieve from NGC |
| CUDA 11 runtime [1] | Included | automatically installed [2] | requires manual installation |
| NPP support [3] | Included | automatically installed [2] | requires manual installation |
| TensorRT support [4] | Included | automatically installed [2] | requires manual installation |
| Vulkan support [5] | Included | automatically installed [2] | requires manual installation |
| V4L2 support [6] | Included | automatically installed [2] | requires manual installation |
| Torch support [7] | Included | requires manual installation [8] | requires manual installation [8] |
| Rivermax support [9] | add on top of the image [10] | requires manual installation [11] | requires manual installation [11] |
| CLI support | needs docker dind with buildx plugin on top of the image | needs docker w/ buildx plugin | needs docker w/ buildx plugin |

Refer to the documentation in each of these for specific install and run instructions.
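As a quick orientation, pulling and running the NGC container might look like the sketch below. The image tag shown is an assumption; check the NGC catalog for the current one, and refer to the container page for the full, supported run command.

```shell
# Pull the Holoscan container from NGC
# (tag is an assumption; check the NGC catalog for the current v0.6 tag)
docker pull nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0

# Run with GPU access and X11 forwarding so graphical examples can render
docker run -it --rm --net host \
  --runtime=nvidia --gpus all \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -e DISPLAY=$DISPLAY \
  nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0
```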

Not sure what to choose?

  • The Holoscan container image on NGC is the safest way to ensure all the dependencies are present with the expected versions (including torch [8]). It is the simplest way to run the embedded examples, while still allowing you to create your own C++ and Python Holoscan applications on top of it. These benefits come with the standard inconveniences of using Docker, such as more complex run instructions for proper configuration. Also, supporting Rivermax or the CLI requires more work than the other solutions at this time.

  • If you are confident in your ability to manage dependencies on your own in your host environment, the Holoscan Debian package should provide all the capabilities needed to use the Holoscan SDK.

  • If you are not interested in the C++ API but just need to work in Python, or want to use a newer version of Python than 3.8 (up to 3.11), you can use the Holoscan Python wheels on PyPI (just pip install holoscan). While they are the easiest solution to get started, you might need additional work to set up your environment with adequate dependencies, depending on your needs.
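For the Python wheels route, getting started can be as simple as the sketch below (whether the installed module exposes further metadata is not covered here; this only verifies the install):

```shell
# Install the Holoscan Python wheel from PyPI
pip install holoscan

# Verify the module imports successfully
python3 -c "import holoscan"
```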

Need more control over the SDK?

The Holoscan SDK source repository is open-source and provides reference implementations as well as infrastructure for building the SDK yourself.

Attention

We only recommend building the SDK from source if you need to build it with debug symbols or other options not used as part of the published packages. If you want to write your own operator or application, you can use the SDK as a dependency (and contribute to HoloHub). If you need to make other modifications to the SDK, file a feature or bug request.
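If you do decide to build from source, a typical flow might look like the following sketch. It assumes the holoscan-sdk repository's top-level `run` helper script; consult the repository README for the actual build options and prerequisites.

```shell
# Clone the open-source SDK repository
git clone https://github.com/nvidia-holoscan/holoscan-sdk.git
cd holoscan-sdk

# Build using the repository's helper script
# (exact flags for debug symbols are an assumption; see ./run --help)
./run build
```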


[1]

CUDA 11.4+ (< 12.0) is required. Already installed on NVIDIA developer kits with HoloPack and JetPack.

[2]

Debian installation on x86_64 requires the latest cuda-keyring package to automatically install all dependencies.
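Installing the cuda-keyring package might look like the sketch below. The repository path and package version shown are assumptions for Ubuntu 22.04 on x86_64; adjust them for your distribution per NVIDIA's CUDA network-repository instructions.

```shell
# Download the NVIDIA CUDA keyring package
# (Ubuntu 22.04 x86_64 repo path and version are assumptions; adjust as needed)
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb

# Install the keyring and refresh the package index
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt-get update
```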

[3]

NPP 11.4+ needed for the FormatConverter operator. Already installed on NVIDIA developer kits with HoloPack and JetPack.

[4]

TensorRT 8.2.3+ and cuDNN needed for the Inference operator. Already installed on NVIDIA developer kits with HoloPack and JetPack.

[5]

Vulkan 1.2.131+ loader needed for the HoloViz operator (+ libegl1 for headless rendering). Already installed on NVIDIA developer kits with HoloPack and JetPack.

[6]

V4L2 1.18+ needed for the V4L2 operator. Already installed on NVIDIA developer kits with HoloPack and JetPack.

[7]

Torch support requires LibTorch 1.12+, TorchVision 0.14.1+, OpenBLAS 0.3.8+ (all systems), and OpenMPI (aarch64 only).

[8]

To get LibTorch and TorchVision on aarch64 (NVIDIA devkits), build them from source, download our pre-built installation, or copy them from the holoscan arm64 container (in /opt).
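Copying the pre-built libraries out of the arm64 container could be sketched as follows. The image tag and the exact directory names under /opt are assumptions; inspect the container to confirm them.

```shell
# Create a stopped container from the arm64 Holoscan image
# (tag is an assumption; check NGC for the current one)
docker create --name holoscan-tmp nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0

# Copy the LibTorch and TorchVision installs out of /opt
# (exact directory names under /opt are assumptions)
docker cp holoscan-tmp:/opt/libtorch ./libtorch
docker cp holoscan-tmp:/opt/torchvision ./torchvision

# Remove the temporary container
docker rm holoscan-tmp
```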

[9]

Tested with Rivermax SDK 1.20

[10]

Supporting the Rivermax SDK in a container also requires adding the Mellanox OFED user space in that container

[11]

Rivermax SDK and OFED drivers are installed on NVIDIA developer kits with SDK Manager through the Rivermax SDK program.

© Copyright 2022-2023, NVIDIA. Last updated on Feb 9, 2024.