Introduction

This is User Guide version 3.0.0 for the NVIDIA AI Workbench Beta release. It is significantly changed from the previous version. The new organization is as follows:

  • Overviews: Basic concepts and high-level explanations for overall understanding.

  • How-Tos: Step-by-step instructions for particular features or actions.

  • Quickstarts: A workflow to get up and running in the fewest steps possible.

  • Deep Dives: More detailed references for how things work or advanced features.

  • FAQs and Troubleshooting: Answers to common questions, known issues, and checklists to work through them.

  • Roadmap: Features and capabilities that will be available at GA and beyond.

In addition to the reorganized documentation, AI Workbench itself provides:

  • Streamlining and abstraction for Git and container-based development environments.

  • A simplified user experience with NVIDIA GPUs on local and remote systems.

  • A friendly UI that runs locally but also provides easy access to remote systems for a simple, uniform UX.

Some Concepts

More concepts will be explained in the rest of the User Guide, but the most basic ones are:

Location

A system (local or remote) with the appropriate AI Workbench components installed. Provides a predictable, managed user experience independent of the particular system.

Project

An AI Workbench managed Git repository that comes with a containerized environment. Provides predictable portability and reproducibility across Locations.

Current Capabilities

Broad Operating System Support

Windows 11, and Windows 10 build 19041 or higher; Ubuntu 22.04; and macOS version 12 or higher. Provides a uniform experience independent of operating system or Location.

Fast, Self-Serve Installation

Click-through install on local systems or command-line install on remote systems in five minutes or less, depending on system resources.

Streamlined GPU Integration

Minimizes what users must do to configure and maintain the NVIDIA stack and various runtime configurations. Provides a uniform experience independent of Location.

Easy Containerized Development Environments

Containerized JupyterLab provides isolation and reproducibility without requiring you to handle the details. Provides a customizable environment that is portable across Locations.

Simple Comprehensive Version Control

Version code, data, environments, and configurations in a speedy, Git-compliant way. Provides customization that individual users can adapt without breaking portability or collaboration.

Streamlined Git Servers and Container Registries

Integrates with Git servers and container registries to simplify management and collaboration for AI projects. Provides organization and transparency for multiple services without overhead for individual Projects or Locations.

Seamless Cross-Device and Cross-User

Portability and reproducibility across users and systems. Handles system- and user-specific idiosyncrasies like mounts, credentials, and secrets without high overhead.

Scalable Application Delivery and Collaboration

Simple packaging for reproducible applications and workflows. Provides easy distribution through platforms like GitHub and GitLab with seamless inclusion of containers from sources like NVIDIA’s NGC.

Current Limitations

No Branching Support

AI Workbench does not support branching. Each repository is on the main branch.

No VS Code Support

AI Workbench does not include VS Code support.

No Concurrent Multi-User Support for Remote Locations

Remote Locations do not support multiple users concurrently. Users may connect one at a time.

How it Works

Client-Server Architecture

You install AI Workbench on your primary local system, e.g. a laptop, as well as on remote systems. The UI runs locally but provides access to the remote systems within a simple, unified user experience. You can develop and compute locally or remotely, and AI Workbench integrates with your Git servers to sync work across machines.

Git

AI Workbench uses Git under the hood, both to streamline and simplify workflows and to enable comprehensive versioning of files, environments, and data. AI Workbench compatible Git repositories are called Projects. They only need to contain some simple metadata and configuration information that lets AI Workbench automate everything needed for portability and reproducibility, including references and configuration files for the container that provides the isolated environment.
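
For intuition, a Project's metadata might look something like the following sketch. The actual schema is defined by AI Workbench; every field name and value here is an illustrative assumption, not the real specification.

```yaml
# Illustrative sketch only: the field names below are assumptions,
# not the actual AI Workbench Project schema.
name: mnist-example            # hypothetical Project name
description: Example Workbench Project
environment:
  base_image: nvcr.io/nvidia/pytorch:23.10-py3  # container base image (hypothetical tag)
  gpu_required: true           # hypothetical flag for GPU configuration
apps:
  - jupyterlab                 # containerized application to expose
```

Because this information lives as plain files in the repository, it is versioned along with the code, which is what makes a Project portable across Locations.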

Containers

AI Workbench uses Docker or Podman under the hood, providing streamlining and features that simplify container builds and runtime configuration. Environment configuration is captured and versioned in files within the Git repository. Each Project (i.e. an AI Workbench compatible Git repository) has a reference to the required base image and the configuration files needed to build the environment.

NVIDIA GPUs

AI Workbench minimizes system dependencies for working with GPUs by installing CUDA at the container level. In addition, it provides an abstraction layer that simplifies configuring containers to use GPUs and system resources.

Applications and IDEs

AI Workbench Projects capture configuration for applications installed in the container, and AI Workbench uses this configuration to simplify the UX for deployment and use on local or remote systems. Applications like JupyterLab are containerized but AI Workbench can also work with native applications that run in the operating system like VS Code (Coming Soon).

Integrations

AI Workbench has a limited set of dedicated integrations with third-party platforms and services, like GitHub.com, GitLab.com, and ngc.nvidia.com. An Integration handles credentials and APIs for a uniform user experience independent of Projects and Locations.

AI Workbench has four basic components and a simple set of system dependencies.

Application Components

The Workbench Service

A single binary that provides the core application, automating things like Git operations and the container runtime. It is installed on your primary local system, as well as on remote systems.

The Credential Manager

A single binary that handles authentication for the various integrations. It is installed on your primary local system. On remote systems, it is installed but not used.

The Command-line Interface (CLI)

A single binary that provides a CLI. It is installed on your primary local system, as well as on remote systems.
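
As an example, a typical session with the CLI might look like the following. The `nvwb` binary name and subcommands shown here are assumptions based on the Beta and may differ in your installation.

```shell
# Hypothetical CLI session; the `nvwb` binary name and subcommands
# are assumptions and may differ in your installation.
if command -v nvwb >/dev/null 2>&1; then
    nvwb list                # list available Projects (assumed subcommand)
    nvwb activate my-remote  # switch context to a remote Location (assumed)
    nvwb open jupyterlab     # start the containerized JupyterLab app (assumed)
else
    echo "nvwb CLI not found on PATH"
fi
```

Because the same binary is installed locally and remotely, the same commands work regardless of which Location you are targeting.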

The Desktop Application

An Electron application that is installed on your primary local system, e.g. your laptop, for a graphical user interface. It is not installed on remote systems.

Main System Dependencies

AI Workbench has some simple high-level dependencies, regardless of operating system.

Versioning

You need Git and Git-LFS to support Project versioning. If they are already installed but do not meet the minimum versions, AI Workbench will update them to the latest versions. If they aren't installed, AI Workbench will install them for you.
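
You can check whether these tools are already present before installing. A minimal sketch using standard commands:

```shell
# Check whether Git and Git-LFS are already present, printing their versions.
for tool in git git-lfs; do
    if command -v "$tool" >/dev/null 2>&1; then
        "$tool" --version
    else
        echo "$tool not installed (AI Workbench would install it)"
    fi
done
```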

Containers

You need either Docker or Podman to support containerized environments. When you install AI Workbench you will need to select which one to use. There is some variation in how these are installed or updated according to which you choose and which operating system you have.
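
A quick way to see which supported runtime, if any, is already on a system is a sketch like the following:

```shell
# Detect which supported container runtime is available on this system.
if command -v docker >/dev/null 2>&1; then
    echo "runtime: docker ($(docker --version 2>/dev/null))"
elif command -v podman >/dev/null 2>&1; then
    echo "runtime: podman ($(podman --version 2>/dev/null))"
else
    echo "no supported container runtime found"
fi
```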

GPUs

If the system has GPUs, then you need the latest NVIDIA drivers; how the drivers are handled varies by operating system. You also need the latest NVIDIA Container Toolkit. If it is already installed but does not meet the minimum version, AI Workbench will update it to the latest version. If it is not installed, AI Workbench will install it for you.
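
You can verify both GPU-related dependencies yourself with standard NVIDIA tools, as in this sketch:

```shell
# Check the GPU-related dependencies: the NVIDIA driver (via nvidia-smi)
# and the NVIDIA Container Toolkit (via nvidia-ctk).
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,driver_version --format=csv,noheader
else
    echo "no NVIDIA driver detected"
fi
if command -v nvidia-ctk >/dev/null 2>&1; then
    nvidia-ctk --version
else
    echo "NVIDIA Container Toolkit not installed (AI Workbench would install it)"
fi
```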

Installation Differences by Operating System

Windows

AI Workbench requires WSL2 on Windows. If it is not already installed, AI Workbench will install it for you. This requires a restart. In any case, AI Workbench will download a new WSL distribution called NVIDIA-Workbench. The Desktop App is installed on the Windows side, but the majority of the dependencies and application software are installed in the new WSL distribution. This includes the container runtime and the NVIDIA Container Toolkit. AI Workbench does not handle drivers on Windows at all, and you must install them yourself on the Windows side.

Ubuntu 22.04

AI Workbench will install or update the required dependencies. If a GPU is present on the system and no driver is detected, AI Workbench will install the driver for you if you elect to. This requires a restart.

macOS

AI Workbench will install or update the required dependencies.

Next Steps

  • If you want more high-level information, watch this long-form video (Coming Soon).

  • If you aren’t yet in the Beta, join it here!

  • For OS-specific installation instructions, see the guides for Windows, macOS, local Ubuntu, or remote Ubuntu.

  • To understand the basics, start with an example Workbench Project on the Members Only DevZone page.

  • If you have AI Workbench installed but don’t know where to go next, check out the Desktop Application or the CLI Quickstart Guides.

  • If you are stuck on something, go to the How-Tos section or check out the DevZone forum.

© Copyright 2023-2024, NVIDIA. Last updated on Jan 21, 2024.