Frameworks Support Matrix
Abstract
This support matrix is for NVIDIA® optimized frameworks. The matrix provides a single view into the supported software and specific versions that come packaged with the frameworks based on the container image.
Content that is included in <<>> brackets indicates new content from the previously published version.
The deep learning framework container packages follow a naming convention that is based on the year and month of the image release. For example, the 24.01 release of an image was released in January 2024.
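For readers who script their environments, here is a minimal sketch of how that naming convention is typically applied when pulling a container: the tag is the year.month release shown in the tables below combined with the framework name on NGC. The `nvcr.io/nvidia/<framework>:<release>-py3` layout is the usual pattern for these images, but the exact repository and tag suffix for a given framework should be confirmed in the NGC catalog.

```python
import subprocess

def ngc_image(framework: str, release: str, suffix: str = "py3") -> str:
    """Build an NGC image reference from a framework name and a year.month
    release tag, e.g. ("pytorch", "24.01") -> nvcr.io/nvidia/pytorch:24.01-py3.
    The "-py3" suffix is the usual convention; confirm it per framework on NGC."""
    return f"nvcr.io/nvidia/{framework}:{release}-{suffix}"

if __name__ == "__main__":
    # 24.01 = the January 2024 release, per the naming convention above.
    image = ngc_image("pytorch", "24.01")
    print("Pulling", image)
    # Requires Docker; GPU execution additionally needs the NVIDIA Container Toolkit.
    subprocess.run(["docker", "pull", image], check=True)
```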
24.xx container images
Container Image | 24.09 | 24.08 | 24.07 | 24.06 | 24.05 | 24.04 | 24.03 | 24.02 | 24.01 |
---|---|---|---|---|---|---|---|---|---|
DGX | |||||||||
DGX System | |||||||||
Operating System | Red Hat Enterprise Linux 9 / CentOS 9 and Red Hat Enterprise Linux 8 / CentOS 8 (all DGX systems except DGX Station A100), for all 24.xx releases |
System Requirements | |||||||||
NVIDIA Driver | Release 24.09 is based on CUDA 12.6.1, which requires NVIDIA Driver release 560 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, R520, R530, R545, and R555 drivers, which are not forward-compatible with CUDA 12.6. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. (A driver-check sketch follows this table.) |
Release 24.08 is based on CUDA 12.6, which requires NVIDIA Driver release 560 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, R520, R530, R545, and R555 drivers, which are not forward-compatible with CUDA 12.6. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 24.07 is based on CUDA 12.5.1, which requires NVIDIA Driver release 555 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, R520, and R545 drivers, which are not forward-compatible with CUDA 12.5. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 24.06 is based on CUDA 12.4.1, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 24.05 is based on CUDA 12.4.1, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 24.04 is based on CUDA 12.4.1, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 24.03 is based on CUDA 12.4.0.41, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 24.02 is based on CUDA 12.3.2, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 24.01 is based on CUDA 12.3.2, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 470.57 (or later R470), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R450, R460, R510, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
GPU Model | |||||||||
Base Container Image (included in all containers) | |||||||||
Container OS | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 |
CUDA | NVIDIA CUDA 12.6.1 | NVIDIA CUDA 12.6 | NVIDIA CUDA 12.5.1 | NVIDIA CUDA 12.5.0.23 | NVIDIA CUDA 12.4.1 | NVIDIA CUDA 12.4.1 | NVIDIA CUDA 12.4.0.41 | NVIDIA CUDA 12.3.2 | NVIDIA CUDA 12.3.2 |
cuBLAS | NVIDIA cuBLAS 12.6.3.1 | NVIDIA cuBLAS 12.6.0.22 | NVIDIA cuBLAS 12.5.3.2 | NVIDIA cuBLAS 12.5.2.13 | NVIDIA cuBLAS 12.4.5.8 | NVIDIA cuBLAS 12.4.5.8 | NVIDIA cuBLAS 12.4.2.65 | NVIDIA cuBLAS 12.3.4.1 | NVIDIA cuBLAS 12.3.4.1 |
cuDNN | 9.4.0.58 | 9.3.0.75 | 9.2.1.18 | 9.1.0.70 | 9.1.0.70 | 9.1.0.70 | 9.0.0.306 | 9.0.0.306 | 8.9.7.29 |
cuTENSOR | 2.0.2.5 | 2.0.2.5 | 2.0.2.4 | 2.0.1.2 | 2.0.1.2 | 2.0.1.2 | 2.0.1.2 | 2.0 | 2.0 |
DALI | 1.41 | 1.40 | 1.39 | 1.38 | 1.37.1 | 1.36 | 1.35 | 1.34 | 1.33 |
NCCL | 2.22.3 | 2.22.3 | 2.22.3 | 2.21.5 | 2.21.5 | 2.21.5 | 2.20 | 2.19.4 | 2.19.4 |
TensorRT | TensorRT 10.4.0.26 | TensorRT 10.3.0.26 | TensorRT 10.2.0.19 | TensorRT 10.1.0.27 | TensorRT 10.0.1.6 | TensorRT 8.6.3 | TensorRT 8.6.3 | TensorRT 8.6.3 | TensorRT 8.6.1.6 |
rdma-core | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 |
NVIDIA HPC-X | 2.20 | 2.19 | 2.19 | 2.19 | 2.19 | 2.18 | 2.1 | 2.16rc4 | 2.16rc4 |
GDRcopy | 2.3 | 2.3 | |||||||
Nsight Compute | 2024.3.1.2 | 2024.3.0.15 | 2024.2.1.2 | 2024.2.0.16 | 2024.1.1.4 | 2024.1.1.4 | 2024.1.0.13 | 2023.3.1.1 | 2023.3.1.1 |
Nsight Systems | 2024.4.2.133 | 2024.4.2.133 | 2024.4.2.133 | 2024.2.3.38 | 2024.2.1.106 | 2024.2.1.106 | 2024.2.1.38 | 2023.4.1.97 | 2023.4.1.97 |
NVIDIA Optimized Frameworks | |||||||||
DGL | 2.4.0 (including DGL-Graphbolt, a recently released GNN dataloader library which has achieved state-of-the-art performance on NVIDIA GPUs) | - | 2.3.0 (including DGL-Graphbolt) | - | 2.2+22aea5c (including DGL-Graphbolt) | 2.1+e1f7738 (including DGL-Graphbolt) | 2.1+7c51cd16 | - | 1.2 |
Multi arch support: x86, Arm SBSA | - | Multi arch support: x86, Arm SBSA | - | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | - | Multi arch support: x86, Arm SBSA | |
Docker image size: 24.5 GB | - | Docker image size: 23.3 GB | - | Docker image size: 21.0 GB | Docker image size: 23.6 GB | Docker image size: 23.3 GB | - | Docker image size: 24.8 GB | |
JAX | - | - | - | - | - | JAX v0.4.26 | - | - | - |
- | - | - | - | - | Multi arch support: x86 only | - | |||
- | - | - | - | - | Docker image size: 10.1GB | ||||
NVIDIA Optimized Deep Learning Framework, powered by Apache MXNet | - | - | - | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 |
- | - | - | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | |
- | - | - | Docker image size: 12.7 GB | Docker image size: 12.6 GB | Docker image size: 12.2 GB | Docker image size: 12.1 GB | Docker image size: 12.0 GB | Docker image size: 12.1 GB | |
PaddlePaddle | 2.6.1 | 2.6.1 | 2.6.1 | 2.6.0 | 2.6.0 | 2.6.0 | 2.6.0 | 2.5.2 | 2.5.2 |
Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | |
Docker image size: 11.8 GB | Docker image size: 11.7 GB | Docker image size: 11.3 GB | Docker image size: 10.0 GB | Docker image size: 9.93 GB | Docker image size: 9.58 GB | Docker image size: 9.55 GB | Docker image size: 8.94 GB | Docker image size: 9.01 GB | |
PyG | 2.6.0 (PyTorch 2.5.0a0+b465a5843b) | - | 2.6.0 (PyTorch 2.4.0a0+3bcc3cddb5) | - | 2.6.0 (PyTorch 2.4.0a0+07cecf4) | | | | |
Multi arch support: x86, Arm SBSA | - | Multi arch support: x86, Arm SBSA | - | Multi arch support: x86, Arm SBSA | |||||
Docker image size: 22.7 GB | - | Docker image size: 22.2 GB | - | Docker image size: 20.5 GB | |||||
PyTorch | 2.5.0a0+b465a5843b | 2.5.0a0+872d972e41 | 2.4.0a0+3bcc3cddb5 | 2.4.0a0+f70bd71a48 | 2.4.0a0+07cecf4 | 2.3.0a0+6ddf5cf85e | 2.3.0a0+40ec155e58 | 2.3.0a0+ebedce2 | 2.2.0a0+81ea7a4 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | |
Docker image size: 21 GB | Docker image size: 20.4 GB | Docker image size: 18.32 GB | Docker image size: 19.2 GB | Docker image size: 18.8 GB | Docker image size: 20.0 GB | Docker image size: 19.8 GB | Docker image size: 22.2 GB | Docker image size: 22.0 GB | |
TensorFlow | 2.16.1 | 2.16.1 | 2.16.1 | 2.16.1 | 2.15.0 | 2.15.0 | 2.15.0 | 2.15.0 | 2.14.0 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | |
Docker image size: 16.1 GB | Docker image size: 15.3 GB | Docker image size: 15.18 GB | Docker image size: 13.8 GB | Docker image size: 13.5 GB | Docker image size: 13.9 GB | Docker image size: 13.9 GB | Docker image size: 14.4 GB | Docker image size: 14.4 GB | |
TensorRT | TensorRT 10.4.0.26 | TensorRT 10.3.0.26 | TensorRT 10.2.0 | TensorRT 10.1.0 | TensorRT 10.0.1.6 | TensorRT 8.6.3 | TensorRT 8.6.3 | TensorRT 8.6.3 | TensorRT 8.6.1.6 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | |
Docker image size: 9.13 GB | Docker image size: 9.07 GB | Docker image size: 9.60 GB | Docker image size: 7.56 GB | Docker image size: 7.51 GB | Docker image size: 7.16 GB | Docker image size: 7.15 GB | Docker image size: 7.05 GB | Docker image size: 7.46 GB | |
Triton Inference Server | 2.47 | 2.47 | 2.47 | 2.47 | 2.46 | 2.45 | 2.43 | 2.43 | 2.41 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | |
Docker image size: 16.9 GB | Docker image size: 16.8 GB | Docker image size: 15.98 GB | Docker image size: 15.5 GB | Docker image size: 15.3 GB | Docker image size: 14.8 GB | Docker image size: 14.9 GB | Docker image size: 13.8 GB | Docker image size: 14.7 GB | |
TensorFlow For Jetson | 2.16.1 | 2.16.1 | 2.16.0 | 2.15.0 | 2.15.0 | 2.15.0 | 2.15.0 | 2.15.0 | |
PyTorch for Jetson | 2.5.0a0+b465a5843b | 2.5.0a0+872d972e41 | 2.4.0a0+3bcc3cddb5 | 2.4.0a0+f70bf71 | 2.4.0a0+07cecf4 | 2.3.0a0+6ddf5cf85e | 2.3.0a0+40ec155e58 | 2.3.0a0+ebedce2 | |
Triton for Jetson | 2.50.0 | 2.49.0 | 2.48.0 | 2.47.0 | 2.46.0 | 2.45.0 | 2.44.0 | 2.43.0 |
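Because each monthly release has a minimum driver branch, plus a short list of older data center branches that remain usable through CUDA forward compatibility, it can be handy to verify the installed driver before starting a container. The sketch below is a simplified check under the requirements quoted above for 24.07 through 24.09: it reads the driver version with nvidia-smi and compares only the major branch number, ignoring the minimum point release within each legacy branch; the REQUIREMENTS table is an illustrative excerpt, not the full matrix.

```python
import subprocess

# Excerpt of the driver requirements quoted above (24.07 through 24.09 only).
# "minimum" is the baseline driver branch; "legacy_branches" are the older
# data center branches supported through CUDA forward compatibility.
REQUIREMENTS = {
    "24.09": {"minimum": 560, "legacy_branches": {470, 525, 535, 545}},
    "24.08": {"minimum": 560, "legacy_branches": {470, 525, 535, 545}},
    "24.07": {"minimum": 555, "legacy_branches": {470, 525, 535, 545}},
}

def installed_driver_branch() -> int:
    """Return the major branch of the installed driver, e.g. 535 for '535.104.05'."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        text=True,
    )
    return int(out.strip().splitlines()[0].split(".")[0])

def driver_ok(release: str) -> bool:
    """Simplified check: branch meets the minimum, or matches a legacy branch.
    (Ignores the minimum point release within each legacy branch, e.g. 470.57.)"""
    req = REQUIREMENTS[release]
    branch = installed_driver_branch()
    return branch >= req["minimum"] or branch in req["legacy_branches"]

if __name__ == "__main__":
    print("Driver acceptable for the 24.09 containers:", driver_ok("24.09"))
```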
Content that is included in <<>> brackets indicates new content from the previously published version.
The deep learning framework container packages follow a naming convention that is based on the year and month of the image release. For example, the 23.01 release of an image was released in January 2023.
23.xx container images
Container Image | 23.12 | 23.11 | 23.10 | 23.09 | 23.08 | 23.07 | 23.06 | 23.05 | 23.04 | 23.03 | 23.02 | 23.01 | |||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
DGX | |||||||||||||||
DGX System | |||||||||||||||
Operating System | Red Hat Enterprise Linux 9 / CentOS 9 and Red Hat Enterprise Linux 8 / CentOS 8 (all DGX systems except DGX Station A100) for releases 23.06 through 23.12; Red Hat Enterprise Linux 7 / CentOS 7 and Red Hat Enterprise Linux 8 / CentOS 8 (all DGX systems except DGX Station A100) for releases 23.01 through 23.05 |
System Requirements | |||||||||||||||
NVIDIA Driver | Release 23.12 is based on CUDA 12.3.2, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R460, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 23.11 is based on CUDA 12.3.0, which requires NVIDIA Driver release 545 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 525.85 (or later R525), 535.86 (or later R535), or 545.23 (or later R545). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, R460, and R520 drivers, which are not forward-compatible with CUDA 12.3. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades. |
Release 23.10 is based on CUDA 12.2.2, which requires NVIDIA Driver release 535 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 535.86 (or later R535). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.2. |
Release 23.09 is based on CUDA 12.2.1, which requires NVIDIA Driver release 535 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 535.86 (or later R535). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.2. |
Release 23.08 is based on CUDA 12.2.1, which requires NVIDIA Driver release 535 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 535.86 (or later R535). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.2. |
Release 23.07 is based on CUDA 12.1.1, which requires NVIDIA Driver release 530 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 530.30 (or later R530). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.1. |
Release 23.06 is based on CUDA 12.1.1, which requires NVIDIA Driver release 530 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 530.30 (or later R530). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.1. |
Release 23.05 is based on CUDA 12.1.1, which requires NVIDIA Driver release 530 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 530.30 (or later R530). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.1. |
Release 23.04 is based on CUDA 12.1.0, which requires NVIDIA Driver release 530 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 530.30 (or later R530). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.0. |
Release 23.03 is based on CUDA 12.1.0, which requires NVIDIA Driver release 530 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), 525.85 (or later R525), or 530.30 (or later R530). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.0. |
Release 23.02 is based on CUDA 12.0.1, which requires NVIDIA Driver release 525 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), or 525.85 (or later R525). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.0. |
Release 23.01 is based on CUDA 12.0.1, which requires NVIDIA Driver release 525 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), 515.65 (or later R515), or 525.85 (or later R525). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 12.0. |
|||
GPU Model | |||||||||||||||
Base Container Image (included in all containers) | |||||||||||||||
Container OS | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 22.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | |||
CUDA | NVIDIA CUDA 12.3.2 | NVIDIA CUDA 12.3.0 | NVIDIA CUDA 12.2.2 | NVIDIA CUDA 12.2.1 | NVIDIA CUDA 12.2.1 | NVIDIA CUDA 12.1.1 | NVIDIA CUDA 12.1.1 | NVIDIA CUDA 12.1.1 | NVIDIA CUDA 12.1.0 | NVIDIA CUDA 12.1.0 | NVIDIA CUDA 12.0.1 | NVIDIA CUDA 12.0.1 | |||
cuBLAS | NVIDIA cuBLAS 12.3.4.1 | NVIDIA cuBLAS 12.3.2.1 | NVIDIA cuBLAS 12.2.5.6 | NVIDIA cuBLAS 12.2.5.6 | NVIDIA cuBLAS 12.2.5.1 | NVIDIA cuBLAS 12.1.3.1 | NVIDIA cuBLAS 12.1.3.1 | NVIDIA cuBLAS 12.1.3.1 | NVIDIA cuBLAS 12.1.3 | cuBLAS from CUDA 12.1.0 | 12.0.2 from CUDA | 12.0.2 from CUDA | |||
cuDNN | 8.9.7.29 | 8.9.6.50 | 8.9.5 | 8.9.5 | 8.9.4 | 8.9.3 | 8.9.2 | 8.9.1.23 | 8.9.0 | 8.8.1.3 | 8.7.0 | 8.7.0 | |||
cuTENSOR | 1.7.0.1 | 1.7.0.1 | 1.7.0.1 | 1.7.0.1 | 1.7.0.1 | 1.7.0.1 | 1.7.0.1 | 1.7.0.1 | 1.7.0 | 1.6.2.3 | 1.6.2.3 | 1.6.2.3 | |||
DALI | 1.32.0 | 1.31.0 | 1.30.0 | 1.29.0 | 1.28.0 | 1.27.0 | 1.26.0 | 1.25.0 | 1.24.0 | 1.23.0 | 1.22.0 | 1.21.0 | |||
NCCL | 2.19.3 | 2.19.3 | 2.19.3 | 2.18.5 | 2.18.3 | 2.18.3 | 2.18.1 | 2.18.1 | 2.17.1 | 2.17.1 | 2.16.5 | 2.16.5 | |||
TensorRT | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.2 | TensorRT 8.6.1 | TensorRT 8.5.3 | TensorRT 8.5.3 | TensorRT 8.5.2.2 | |||
rdma-core | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 | 39.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | |||
NVIDIA HPC-X | 2.16 | 2.16 | 2.16 | 2.16 | 2.15 | 2.15 | 2.15 | 2.14 | 2.13 | 2.13 | 2.13 | 2.13 |
GDRcopy | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | |||
Nsight Compute | 2023.3.1.1 | 2023.3.0.12 | 2023.2.1.3 | 2023.2.1.3 | 2023.2.1.3 | 2023.1.1.4 | 2023.1.1.4 | 2023.1.1.4 | 2023.1.0.15 | 2023.1.0.15 | 2022.4.1.6 | 2022.4.1.6 | |||
Nsight Systems | 2023.4.1 | 2023.3.1.92 | 2023.3.1.92 | 2023.3.1.92 | 2023.2.3.1001 | 2023.2.3.1001 | 2023.2.3.1001 | 2023.2 | 2023.1.1.127 | 2023.1.1.127 | 2022.5.1 | 2022.5.1 | |||
NVIDIA Optimized Frameworks | |||||||||||||||
DGL | - | 1.1.1 | - | 1.1.1 | - | 1.1.1 | - | - | - | - | - | - |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | ||||||||||||||
Docker image size: 23.4 GB | Docker image size: 20.8 GB | ||||||||||||||
JAX | - | - | | - | | | | | | | | |
Multi arch support: x86 only | Multi arch support: x86 only | ||||||||||||||
Kaldi | - | ||||||||||||||
Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | |||||
Docker image size: 9.13 GB | Docker image size: 9.1 GB | Docker image size: 9.16 GB | Docker image size: 9.36 GB | Docker image size: 9.14 GB | Docker image size: 9.29 GB | Docker image size: 9.19 GB | Docker image size: 10.9 GB | Docker image size: 11.1 GB | Docker image size: 11.8 GB | Docker image size: 11.1 GB | |||||
NVIDIA Optimized Deep Learning Framework, powered by Apache MXNet | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | ||||
Docker image size: 12.1 GB | Docker image size: 12.1 GB | Docker image size: 12 GB | Docker image size: 12.1 GB | Docker image size: 12.1 GB | Docker image size: 12.1 GB | Docker image size: 12.0 GB | Docker image size: 12.1 GB | Docker image size: 13.1 GB | Docker image size: 13.2 GB | Docker image size: 13.9 GB | Docker image size: 13.1 GB | ||||
PaddlePaddle | 2.5.2 | 2.5.2 | 2.5.1 | 2.5.0 | 2.5.0 | 2.4.1 | 2.4.1 | No 23.05 release. | 2.4.1 | 2.4.1 | 2.4.0 | 2.3.2 |
Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | - | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | ||||
Docker image size: 8.98 GB | Docker image size: 8.98 GB | Docker image size: 8.94 GB | Docker image size: 8.99 GB | Docker image size: 9.02 GB | Docker image size: 8.58 GB | Docker image size: 8.59 GB | - | Docker image size: 9.47 GB | Docker image size: 9.74 GB | Docker image size: 10.5 GB | Docker image size: 9.41 GB | ||||
PyTorch | 2.2.0a0+81ea7a48 | 2.2.0a0+6a974be | 2.1.0a0+32f93b1 | 2.1.0a0+32f93b1 | 2.1.0a0+29c30b1 | 2.1.0a0+b5021ba | 2.1.0a0+4136153 | 2.0.0 | 2.1.0a0+fe05266f | 2.0.0a0+1767026 | 1.14.0a0+410ce96 | 1.14.0a0+410ce96 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | ||||
Docker image size: 21.9 GB | Docker image size: 21.9 GB | Docker image size: 22.1 GB | Docker image size: 22.0 GB | Docker image size: 20.6 GB | Docker image size: 19.8 GB | Docker image size: 19.7 GB | Docker image size: 22 GB | Docker image size: 20.4 GB | Docker image size: 20.4 GB | Docker image size: 20.5 GB | Docker image size: 19.7 GB | ||||
TensorFlow | 2.14.0 | 2.14.0 | 2.13.0 | 2.13.0 | 2.13.0 | 2.12.0 | 2.12.0 | 2.12.0 | 2.12.0 | 2.11.0 | 1.15.5 | 2.11.0 | 1.15.5 | 2.11.0 | 1.15.5 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | |
Docker image size: 14.3 GB | Docker image size: 14.1 GB | Docker image size: 14.2 GB | Docker image size: 14.2 GB | Docker image size: 14.2 GB | Docker image size: 13.9 GB | Docker image size: 14.3 GB | Docker image size: 14.2 GB | Docker image size: 15.4 GB | Docker image size: 15.9 GB | Docker image size: 16.3 GB | Docker image size: 16.6 GB | Docker image size: 17.0 GB | Docker image size: 15.9 GB | Docker image size: 16.2 GB | |
TensorRT | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.6 | TensorRT 8.6.1.2 | TensorRT 8.6.1 | TensorRT 8.5.3 | TensorRT 8.5.3 | TensorRT 8.5.2.2 | |||
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | ||||
Docker image size: 7.45 GB | Docker image size: 7.45 GB | Docker image size: 7.41 GB | Docker image size: 7.47 GB | Docker image size: 7.5 GB | Docker image size: 7.45 GB | Docker image size: 7.45 GB | Docker image size: 7.5 GB | Docker image size: 8.05 GB | Docker image size: 8.32 GB | Docker image size: 9.03 GB | Docker image size: 8.3 GB | ||||
Triton Inference Server | 2.41 | 2.40 | 2.39 | 2.38 | 2.37 | 2.36 | 2.35 | 2.34 | 2.33 | 2.32 | 2.31 | 2.30 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | ||||
Docker image size: 14.7 GB | Docker image size: 14.3 GB | Docker image size: 12.6 GB | Docker image size: 12.6 GB | Docker image size: 12.4 GB | Docker image size: 12.3 GB | Docker image size: 12.3 GB | Docker image size: 12.5 GB | Docker image size: 13 GB | Docker image size: 14.7 GB | Docker image size: 15.3 GB | Docker image size: 15.3 GB | ||||
TensorFlow For Jetson | TensorFlow 2.14.0 for Jetson | TensorFlow 2.12.0 for Jetson | TensorFlow 2.12.0 for Jetson | TensorFlow 2.12.0 for Jetson | TensorFlow 1.15.5 and 2.10.1 for Jetson | TensorFlow 1.15.5 and 2.10.1 for Jetson | TensorFlow 1.15.5 and 2.10.1 for Jetson | ||||||||
PyTorch for Jetson | 2.1.0a0+4136153 for Jetson | 2.0.0 for Jetson | 2.1.0a0+fe05266f for Jetson | PyTorch 2.0.0a0+1767026 for Jetson | PyTorch 1.14.0a0+44dac51 for Jetson | PyTorch 1.14.0a0+44dac51 for Jetson | |||||||||
TensorFlow Wheel for x86 | - | - | - | - | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | ||||||||
Triton for Jetson | Triton Inference Server 2.36.0 for Jetson | Triton Inference Server 2.35.0 for Jetson | Triton Inference Server 2.34.0 for Jetson | Triton Inference Server 2.33.0 for Jetson | Triton Inference Server 2.32.0 for Jetson | Triton Inference Server 2.31.0 for Jetson | Triton Inference Server 2.30.0 for Jetson |
Content that is included in <<>> brackets indicates new content from the previously published version.
The deep learning framework container packages follow a naming convention that is based on the year and month of the image release. For example, the 22.03 release of an image was released in March 2022.
22.xx container images
Container Image | 22.12 | 22.11 | 22.10 | 22.09 | 22.08 | 22.07 | 22.06 | 22.05 | 22.04 | 22.03 | 22.02 | 22.01 | ||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
DGX | ||||||||||||||||||||||||
DGX System | ||||||||||||||||||||||||
Operating System | Red Hat Enterprise Linux 7 / CentOS 7 and Red Hat Enterprise Linux 8 / CentOS 8 (all DGX systems except DGX Station A100), for all 22.xx releases (DGX OS is also listed for 22.02 and 22.01) |
System Requirements | ||||||||||||||||||||||||
NVIDIA Driver | Release 22.12 is based on CUDA 11.8.0, which requires NVIDIA Driver release 520 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), or 515.65 (or later R515). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.8. |
Release 22.11 is based on CUDA 11.8.0, which requires NVIDIA Driver release 520 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), or 515.65 (or later R515). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.8. |
Release 22.10 is based on CUDA 11.8.0, which requires NVIDIA Driver release 520 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), or 515.65 (or later R515). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.8. |
Release 22.09 is based on CUDA 11.8.0, which requires NVIDIA Driver release 520 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), 510.47 (or later R510), or 515.65 (or later R515). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.8. |
Release 22.08 is based on CUDA 11.7.1, which requires NVIDIA Driver release 515 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), or 510.47 (or later R510). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.8. |
Release 22.07 is based on CUDA 11.7 Update 1 Preview, which requires NVIDIA Driver release 515 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), or 510.47 (or later R510). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.7. |
Release 22.06 is based on CUDA 11.7 Update 1 Preview, which requires NVIDIA Driver release 515 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), or 510.47 (or later R510). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.7. |
Release 22.05 is based on CUDA 11.7, which requires NVIDIA Driver release 515 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470.57 (or later R470), or 510.47 (or later R510). The CUDA driver's compatibility package only supports particular drivers. Thus, users should upgrade from all R418, R440, and R460 drivers, which are not forward-compatible with CUDA 11.7. |
Release 22.04 is based on NVIDIA CUDA® 11.6.2, which requires NVIDIA Driver release 510 or later. However, if you are running on a Data Center GPU (for example, T4 or any other Tesla board), you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), 460.27 (or later R460), or 470.57 (or later R470). The CUDA driver's compatibility package only supports particular drivers. | Release 22.03 is based on NVIDIA CUDA® 11.6.1, which requires NVIDIA Driver release 510 or later. However, if you are running on a Data Center GPU (for example, T4 or any other Tesla board), you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), 460.27 (or later R460), or 470.57 (or later R470). The CUDA driver's compatibility package only supports particular drivers. | Release 22.02 is based on NVIDIA CUDA 11.6.0, which requires NVIDIA Driver release 510 or later. However, if you are running on a Data Center GPU (for example, T4 or any other Tesla board), you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), 460.27 (or later R460), or 470.57 (or later R470). The CUDA driver's compatibility package only supports particular drivers. | Release 22.01 is based on NVIDIA CUDA 11.6.0, which requires NVIDIA Driver release 510 or later. However, if you are running on a Data Center GPU (for example, T4 or any other Tesla board), you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), 460.27 (or later R460), or 470.57 (or later R470). The CUDA driver's compatibility package only supports particular drivers. | ||||||||||||
GPU Model | ||||||||||||||||||||||||
Base Container Image (included in all containers) | ||||||||||||||||||||||||
Container OS | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | ||||||||||||
CUDA | NVIDIA CUDA 11.8.0 | NVIDIA CUDA 11.8.0 | NVIDIA CUDA 11.8.0 | NVIDIA CUDA 11.8.0 | NVIDIA CUDA 11.7 Update 1 | NVIDIA CUDA 11.7 Update 1 Preview | NVIDIA CUDA 11.7 Update 1 Preview | NVIDIA CUDA 11.7.0 | NVIDIA CUDA 11.6.2 | NVIDIA CUDA 11.6.1 | NVIDIA CUDA 11.6.0 | NVIDIA CUDA 11.6.0 | ||||||||||||
cuBLAS | 11.11.3.6 | 11.11.3.6 | 11.11.3.6 | 11.11.3.6 | 11.10.3.66 | 11.10.3.66 | 11.10.3.66 | 11.10.1.25 | 11.9.3.115 | 11.8.1.74 | 11.8.1.74 | 11.8.1.74 | ||||||||||||
cuDNN | 8.7.0 GA | 8.7.0.80 | 8.6.0.163 | 8.6.0.163 | 8.5.0.96 | 8.4.1 | 8.4.1 | 8.4.0.27 | 8.4.0.27 | 8.3.3.40 | 8.3.2.44 | 8.3.2.44 | ||||||||||||
cuTENSOR | 1.6.1.5 | 1.6.1.5 | 1.6.1.5 | 1.6.1.5 | 1.6.0.2 | 1.5.0.3 | 1.5.0.3 | 1.5.0.3 | 1.5.0.3 | 1.5.0.1 | 1.4 | 1.4 | ||||||||||||
DALI | 1.20.0 | 1.18.0 | 1.18.0 | 1.17.0 | 1.16.0 | 1.15.0 | 1.14.0 | 1.13.0 | 1.12.0 | 1.11.1 | 1.10.0 | 1.9.0 | ||||||||||||
NCCL | 2.15.5 | 2.15.5 | 2.15.5 | 2.15.1 | 2.12.12 | 2.12.12 | 2.12.12 | 2.12.10 | 2.12.10 | 2.12.9 | 2.11.4 | 2.11.4 | ||||||||||||
TensorRT | TensorRT 8.5.1 | TensorRT 8.5.1 | TensorRT 8.5.0.12 | TensorRT 8.5.0.12 | TensorRT 8.4.2.4 | TensorRT 8.4.1 | TensorRT 8.2.5 | TensorRT 8.2.5 | TensorRT 8.2.4.2 | TensorRT 8.2.3 | TensorRT 8.2.3 | TensorRT 8.2.2 | ||||||||||||
rdma-core | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | 36.0 | ||||||||||||
NVIDIA HPC-X | 2.13 | 2.12.2tp1 | 2.12.2tp1 | 2.12.1a0 | 2.10 | 2.10 | 2.10 | 2.10 | 2.10 | 2.10 | 2.10 | 2.10 |
GDRcopy | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | 2.3 | ||||||||||||
Nsight Systems | 2022.4.2.1 | 2022.4.2.1 | 2022.4.2.1 | 2022.4.1 | 2022.1.3.18 | 2022.1.3.3 | 2022.1.3.3 | 2022.1.3.3 | 2022.2.1.31-5fe97ab | 2021.5.2.53 | 2021.5.2.53 | 2021.5.2.53 | ||||||||||||
NVIDIA Optimized Frameworks | ||||||||||||||||||||||||
Kaldi | ||||||||||||||||||||||||
Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | |||||||||||||
Docker image size: 10.4 GB | Docker image size: 10.3 GB | Docker image size: 10.3 GB | Docker image size: 10.3 GB | Docker image size: 8.89 GB | Docker image size: 9.07 GB | Docker image size: 9 GB | Docker image size: 9.11 GB | Docker image size: 9.01 GB | Docker image size: 9 GB | Docker image size: 9 GB | Docker image size: 8.96 GB | |||||||||||||
NVIDIA Optimized Deep Learning Framework, powered by Apache MXNet | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.1 | 1.9.0.rc6 | 1.9.0.rc6 | 1.9.0.rc6 | Release paused | Release paused |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | |||||||||||||||
Docker image size: 12 GB | Docker image size: 11.7 GB | Docker image size: 11.7 GB | Docker image size: 11.7 GB | Docker image size: 9.97 GB | Docker image size: 10.2 GB | Docker image size: 10.1 GB | Docker image size: 10.7 GB | Docker image size: 10.6 GB | Docker image size: 11.0 GB | |||||||||||||||
PyTorch | 1.14.0a0+410ce96 | 1.13.0a0+936e930 | 1.13.0a0+d0d6b1f | 1.13.0a0+d0d6b1f | 1.13.0a0+d321be6 | 1.13.0a0+08820cb | 1.13.0a0+340c412 | 1.12.0a0+8a1a93a | 1.12.0a0+bd13bc6 | 1.12.0a0+2c916ef | 1.11.0a0+17540c5c | 1.11.0a0+bfe5ad28 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | |||||||||||||
Docker image size: 18.3 GB | Docker image size: 17.3 GB | Docker image size: 16.9 GB | Docker image size: 16.8 GB | Docker image size: 14.6 GB | Docker image size: 14.8 GB | Docker image size: 14.6 GB | Docker image size: 14.6 GB | Docker image size: 14.1 GB | Docker image size: 14.6 GB | Docker image size: 14.4 GB | Docker image size: 14.8 GB | |||||||||||||
TensorFlow | 2.10.1 | 1.15.5 | 2.10.0 | 1.15.5 | 2.10.0 | 1.15.5 | 2.9.1 | 1.15.5 | 2.9.1 | 1.15.5 | 2.9.1 | 1.15.5 | 2.9.1 | 1.15.5 | 2.8.0 | 1.15.5 | 2.8.0 | 1.15.5 | 2.8.0 | 1.15.5 | 2.7.0 | 1.15.5 | 2.7.0 | 1.15.5 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | |
Docker image size: 14.3 GB | Docker image size: 14.8 GB | Docker image size: 14.4 GB | Docker image size: 15.0 GB | Docker image size: 14.4 GB | Docker image size: 14.9 GB | Docker image size: 14.1 GB | Docker image size: 14.9 GB | Docker image size: 12 GB | Docker image size: 12.8 GB | Docker image size: 12.2 GB | Docker image size: 13.0 GB | Docker image size: 12.2 GB | Docker image size: 14.4 GB | Docker image size: 12.2 GB | Docker image size: 14.4 GB | Docker image size: 13.1 GB | Docker image size: 14.4 GB | Docker image size: 13.6 GB | Docker image size: 14.9 GB | Docker image size: 13.1 GB | Docker image size: 14.5 GB | Docker image size: 13.1 GB | Docker image size: 15.1 GB | |
TensorRT | TensorRT 8.5.1 | TensorRT 8.5.1 | TensorRT 8.5.0.12 | TensorRT 8.5.0.12 | TensorRT 8.4.2.4 | TensorRT 8.4.1 | TensorRT 8.2.5 | TensorRT 8.2.5 | TensorRT 8.2.4.2 | TensorRT 8.2.3 | TensorRT 8.2.2 | TensorRT 8.2.2 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | |||||||||||||
Docker image size: 7.61 GB | Docker image size: 7.25 GB | Docker image size: 7.51 GB | Docker image size: 7.49 GB | Docker image size: 6.09 GB | Docker image size: 6.27 GB | Docker image size: 6.21 GB | Docker image size: 6.33 GB | Docker image size: 6.21 GB | Docker image size: 6.21 GB | Docker image size: 6.21 GB | Docker image size: 6.17 GB | |||||||||||||
Triton Inference Server | 2.29.0 | 2.28.0 | 2.27.0 | 2.26.0 | 2.25.0 | 2.24.0 | 2.23.0 | 2.22.0 | 2.21.0 | 2.20.0 | 2.19.0 | 2.18.0 |
Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | Multi arch support: x86, Arm SBSA (beta) | |||||||||||||
Docker image size: 14 GB | Docker image size: 13.8 GB | Docker image size: 13.4 GB | Docker image size: 13.7 GB | Docker image size: 11.7 GB | Docker image size: 11.9 GB | Docker image size: 11 GB | Docker image size: 11 GB | Docker image size: 11.4 GB | Docker image size: 12.1 GB | Docker image size: 12.3 GB | Docker image size: 12.4 GB | |||||||||||||
PaddlePaddle | 2.3.2 | 2.3.2 | 2.3.2 | 2.3.0 | 2.3.0 | 2.3.0 | 2.2.2 | 2.2.2 | | | | |
Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only | |||||||||||||||||
Docker image size: 8.72 GB | Docker image size: 8.48 GB | Docker image size: 8.46 GB | Docker image size: 8.44 GB | Docker image size: 8.43 GB | Docker image size: 8.43 GB | Docker image size: 7.98 GB | Docker image size: 8.09 GB | |||||||||||||||||
TensorFlow For Jetson | TensorFlow 1.15.5 and 2.10.1 for Jetson | TensorFlow 1.15.5 and 2.10.0 for Jetson | TensorFlow 1.15.5 and 2.10.0 for Jetson | TensorFlow 1.15.5 and 2.9.1 for Jetson | This release was skipped. | TensorFlow 1.15.5 and 2.9.1 for Jetson | TensorFlow 1.15.5 and 2.9.1 for Jetson | TensorFlow 1.15.5 and 2.8.0 for Jetson | TensorFlow 1.15.5 and 2.8.0 for Jetson | TensorFlow 1.15.5 and 2.8.0 for Jetson | TensorFlow 1.15.5 and 2.7.0 for Jetson | TensorFlow 1.15.5 and 2.7.0 for Jetson | ||||||||||||
PyTorch for Jetson | PyTorch 1.14.0a0+410ce96 for Jetson | PyTorch 1.13.0a0+936e930 for Jetson | PyTorch 1.13.0a0+d0d6b1f for Jetson | PyTorch 1.13.0a0+d0d6b1f for Jetson | This release was skipped. | PyTorch 1.13.0a0+08820cb for Jetson | PyTorch 1.13.0a0+340c412 for Jetson | PyTorch 1.12.0a0+8a1a93a for Jetson | PyTorch 1.12.0a0+84d1cb9 for Jetson | PyTorch 1.12.0a0+2c916ef for Jetson | This release was skipped. | PyTorch 1.11.0a0+bfe5ad28 for Jetson | ||||||||||||
TensorFlow Wheel for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | TensorFlow 1.15.5 for x86 | ||||||||||||
Triton for Jetson | Triton Inference Server 2.27.0 for Jetson | Triton Inference Server 2.27.0 for Jetson | Triton Inference Server 2.27.0 for Jetson | Triton Inference Server 2.26.0 for Jetson | Triton Inference Server 2.24.0 for Jetson | Triton Inference Server 2.24.0 for Jetson | Triton Inference Server 2.23.0 for Jetson | Triton Inference Server 2.22.0 for Jetson | Triton Inference Server 2.21.0 for Jetson | Triton Inference Server 2.20.0 for Jetson | Triton Inference Server 2.19.0 for Jetson | Triton Inference Server 2.18.0 for Jetson |
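The rows above list the exact library versions packaged in each monthly image. A quick way to confirm what a particular container actually ships is to query the Python packages inside it. The snippet below is a minimal sketch, assuming it is run with the container's default Python interpreter; the module names it checks (tensorflow, torch, tensorrt) are illustrative and depend on which framework image you pulled.

```python
# Minimal sketch: run inside a framework container to compare the packaged
# versions against this matrix. The module names below are illustrative;
# adjust them to the framework image you are inspecting.
import importlib

for name in ("tensorflow", "torch", "tensorrt"):
    try:
        module = importlib.import_module(name)
        print(f"{name}: {getattr(module, '__version__', 'unknown version')}")
    except ImportError:
        print(f"{name}: not packaged in this image")
```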
Content that is included in <<>> brackets indicates new content from the previously published version.
The deep learning framework container packages follow a naming convention that is based on the year and month of the image release. For example, the 21.02 release of an image was released in February 2021.
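Because the tag itself encodes the year and month, the release date can be read directly from it. The helper below is a hypothetical illustration (it is not part of any NVIDIA tooling) of how a YY.MM tag such as 21.02 maps to February 2021.

```python
from datetime import datetime

def release_month(tag: str) -> str:
    """Interpret a YY.MM container tag (for example, '21.02') as its release month."""
    year, month = tag.split(".")[:2]
    return datetime(2000 + int(year), int(month), 1).strftime("%B %Y")

print(release_month("21.02"))  # prints: February 2021
```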
21.xx container images
Container Image | 21.12 | 21.11 | 21.10 | 21.09 | 21.08 | 21.07 | 21.06 | 21.05 | 21.04 | 21.03 | 21.02 |
---|---|---|---|---|---|---|---|---|---|---|---|
DGX | ||||||||||||||||||||||
DGX System | | | | | | | | | | | |
Operating System | DGX OS, Red Hat Enterprise Linux 7 / CentOS 72, Red Hat Enterprise Linux 8 / CentOS 82 (All DGX systems except DGX Station A100) (same operating systems for every release in this table) |
NVIDIA Driver | Release 21.12 is based on NVIDIA CUDA 11.5.0, which requires NVIDIA Driver release 495 or later. However, if you are running on a Data Center GPU (for example, T4 or any other Tesla board), you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), 460.27 (or later R460), or 470.57 (or later R470). The CUDA driver's compatibility package only supports particular drivers. 3 | Release 21.11 is based on NVIDIA CUDA 11.5.0, which requires NVIDIA Driver release 495 or later. However, if you are running on a Data Center GPU (for example, T4 or any other Tesla board), you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), 460.27 (or later R460), or 470.57 (or later R470). The CUDA driver's compatibility package only supports particular drivers. 3 | Release 21.10 is based on NVIDIA CUDA 11.4.2 with cuBLAS 11.6.5.2, which requires NVIDIA Driver release 470 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), or 460.27 (or later R460). The CUDA driver's compatibility package only supports particular drivers. 3 |
Release 21.09 is based on NVIDIA CUDA 11.4.2, which requires NVIDIA Driver release 470 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), or 460.27 (or later R460). The CUDA driver's compatibility package only supports particular drivers. 3 |
Release 21.08 is based on NVIDIA CUDA 11.4.1, which requires NVIDIA Driver release 470 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), or 460.27 (or later R460). The CUDA driver's compatibility package only supports particular drivers. 3 |
Release 21.07 is based on NVIDIA CUDA 11.4.0, which requires NVIDIA Driver release 470 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), or 460.27 (or later R460). The CUDA driver's compatibility package only supports particular drivers. 3 |
Release 21.06 is based on NVIDIA CUDA 11.3.1, which requires NVIDIA Driver release 465.19.01 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), or 460.27 (or later R460). The CUDA driver's compatibility package only supports particular drivers. 3 |
Release 21.05 is based on NVIDIA CUDA 11.3.0, which requires NVIDIA Driver release 465.19.01 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), or 460.27 (or later R460). The CUDA driver's compatibility package only supports particular drivers.3 |
Release 21.04 is based on NVIDIA CUDA 11.3.0, which requires NVIDIA Driver release 465.19.01 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), 450.51 (or later R450), or 460.27 (or later R460). The CUDA driver's compatibility package only supports particular drivers. 3 |
Release 21.03 is based on NVIDIA CUDA 11.2.1, which requires NVIDIA Driver release 460.32.03 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), or 450.51 (or later R450). The CUDA driver's compatibility package only supports particular drivers. 3 |
Release 21.02 is based on NVIDIA CUDA 11.2.0, which requires NVIDIA Driver release 460.27.04 or later. However, if you are running on Data Center GPUs (formerly Tesla), for example, T4, you may use NVIDIA driver release 418.40 (or later R418), 440.33 (or later R440), or 450.51 (or later R450). The CUDA driver's compatibility package only supports particular drivers. 3 |
GPU Model | ||||||||||||||||||||||
Base Container Image | ||||||||||||||||||||||
Container OS | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | Ubuntu 20.04 | |||||||||||
CUDA | NVIDIA CUDA 11.5.0 | NVIDIA CUDA 11.5.0 | NVIDIA CUDA 11.4.2 with cuBLAS 11.6.5.2 | 11.4.2 | 11.4.1 | 11.4.0 | 11.3.1 | 11.3.0 | 11.3.0 | 11.2.1 | 11.2.0 | |||||||||||
cuBLAS | 11.7.3.1 | 11.7.3.1 | 11.6.1.51 | 11.6.1.51 | 11.5.4 | 11.5.2.43 | 11.5.1.109 | 11.5.1.101 | 11.5.1.101 | 11.4.1.1026 | 11.3.1.68 | |||||||||||
cuDNN | 8.3.1.22 | 8.3.0.96 | 8.2.4.15 | 8.2.4.15 | 8.2.2.26 | 8.2.2.26 | 8.2.1 | 8.2.0.51 | 8.2.0.41 | 8.1.1 | 8.1.0.77 | |||||||||||
NCCL | 2.11.4 | 2.11.4 | 2.11.4 | 2.11.4 | 2.10.3 | 2.10.3 | 2.9.9 | 2.9.8 | 2.9.6 | 2.8.4 | 2.8.4 | |||||||||||
TensorRT | TensorRT 8.2.1.8 | TensorRT 8.0.3.4 | ||||||||||||||||||||
NVIDIA Optimized Frameworks | ||||||||||||||||||||||
Kaldi | | | | | | | | | | | |
Multi arch support: x86 only | Multi arch support: x86 only | Multi arch support: x86 only |
Docker image size: 8.78 GB | Docker image size: 8.69 GB | Docker image size: 9.16 GB | Docker image size: 9.12 GB | Docker image size: 8.86 GB | Docker image size: 8.77 GB | Docker image size: 8.62 GB | Docker image size: 8.43 GB | Docker image size: 8.3 GB | Docker image size: 8.62 GB | Docker image size: 8.73 GB | ||||||||||||
DIGITS | Release paused | Release paused | Release paused | 6.1.1 including | 6.1.1 including | 6.1.1 including | 6.1.1 including | 6.1.1 including | 6.1.1 including | 6.1.1 including | 6.1.1 including |
- | - | - | Docker image size: 14.6 GB | Docker image size: 14.9 GB | Docker image size: 15 GB | Docker image size: 14.7 GB | Docker image size: 15.1 GB | Docker image size: 15.1 GB | Docker image size: 15.4 GB | Docker image size: 15.5 GB | ||||||||||||
NVIDIA Optimized Deep Learning Framework, powered by Apache MXNet | Release paused | Release paused | Release paused | 1.9.0.rc6 including | 1.9.0.rc6 including | 1.9.0.rc3 including | 1.9.0.rc2 including | 1.8.0 including | 1.8.0 including | 1.8.0 including | 1.8.0.rc2 including |
- | - | - | Docker image size: 11.2 GB | Docker image size: 10.9 GB | Docker image size: 10.6 GB | Docker image size: 10.4 GB | Docker image size: 10.8 GB | Docker image size: 10.7 GB | Docker image size: 11.1 GB | Docker image size: 10.8 GB | ||||||||||||
PyTorch | 1.11.0a0+b6df043 including | 1.11.0a0+b6df043 including | 1.10.0a0+0aef44c including | 1.10.0a0+3fd9dcf including | 1.10.0a0+3fd9dcf including | 1.10.0a0+ecc3718 including | 1.9.0a0+c3d40fd including | 1.9.0a0+2ecb2c7 including | 1.9.0a0+2ecb2c7 including | 1.9.0a0+df837d0 including | 1.8.0a0+52ea372 including |