Reference Overview

The reference section is where you’ll find detailed documentation about the various concepts and features of AI Workbench. We recommend starting with this article to get a high-level overview of what the tool has to offer. After that, feel free to explore the other topics in any order that suits your needs.

Here’s what you can expect to find in the reference section:

  • Conceptual overviews of how AI Workbench works

  • Detailed explanations of AI Workbench’s core features and functionality

  • Step-by-step instructions for certain features and flows

AI Workbench is a versatile tool designed for AI and data science development. It is an ideal solution for deploying simple AI applications, whether you’re working on a laptop, workstation, virtual machine, or server, in the cloud or in the data center.

Key Features:

  • Containerized development environments: AI Workbench manages and deploys containerized environments, ensuring consistent and reproducible setups for your AI projects.

  • Client-server architecture: The server handles all the heavy lifting, while the client (desktop app or command-line interface) provides the user interface. This lets you interact with various compute resources through the CLI or desktop app on your primary machine (e.g., a laptop or workstation).

  • Platform compatibility: Install the server on a wide range of compute resources, and the Desktop app or CLI on macOS, Windows, or Ubuntu, making AI Workbench a flexible and adaptable tool for your AI development needs.

  • Single-instance use cases: AI Workbench is optimized for single-instance workloads, providing a streamlined and focused development experience with up to 8 GPUs.

AI Workbench itself is composed of four primary components:

  • Server: The central component that does the heavy lifting and exposes a GraphQL API for the clients; it is sometimes also referred to as the AI Workbench Service. (A minimal client sketch follows this list.)

  • Desktop App: A user-friendly, cross-platform application that provides the graphical user interface for AI Workbench.

  • Command-Line Interface (CLI): For users who prefer working in the terminal, AI Workbench provides a powerful and intuitive CLI.

  • Credential Manager: The credential manager is a small application that integrates with your host’s keychain or secret store.
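
The Server component above exposes a GraphQL API that the Desktop App and CLI consume. Purely as an illustration of what that means at the protocol level, the sketch below sends a standard GraphQL introspection query from Python. The endpoint URL and port are placeholder assumptions, not the documented address of the AI Workbench Service; in practice the Desktop App and CLI handle this communication for you.

```python
# Minimal sketch of a GraphQL client, for illustration only.
# The endpoint below is a placeholder, not the real address of the
# AI Workbench Service on your system.
import requests

GRAPHQL_URL = "http://localhost:8080/graphql"  # placeholder endpoint

# A standard GraphQL introspection query; it makes no assumptions
# about the actual AI Workbench schema.
INTROSPECTION_QUERY = """
{
  __schema {
    queryType { name }
    mutationType { name }
  }
}
"""

def main() -> None:
    # POST the query as JSON, the conventional transport for GraphQL over HTTP.
    response = requests.post(
        GRAPHQL_URL,
        json={"query": INTROSPECTION_QUERY},
        timeout=10,
    )
    response.raise_for_status()
    print(response.json())

if __name__ == "__main__":
    main()
```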

AI Workbench also relies on additional NVIDIA and third-party dependencies.

Hint

Learn more about these components in the AI Workbench System reference.

AI Workbench should be installed locally on your primary system, such as the laptop or workstation you use day to day. This local installation provides the user interface and lets you connect to various other systems as needed.

Additionally, AI Workbench can be installed on a remote workstation, server, or virtual machine. You can then connect to this remote installation from your local AI Workbench Desktop App or CLI. We refer to such a remote compute resource as a remote location.

You can connect to multiple remote locations, or use none at all; the choice depends entirely on your use case and the resources available to you.
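
In practice, the Desktop App and CLI manage connections to remote locations for you. As a rough sketch of the underlying idea, connecting a local client to a server running on a remote location amounts to reaching a service port on that machine, for example over an SSH tunnel. The hostname, user, and port numbers below are placeholder assumptions, not values documented for AI Workbench.

```python
# Illustrative sketch only: AI Workbench manages remote connections itself.
# This shows the general pattern of reaching a service on a remote location
# by forwarding a local port over SSH. Host, user, and ports are placeholders.
import subprocess

REMOTE_HOST = "user@workstation.example.com"  # placeholder remote location
LOCAL_PORT = 10000                            # placeholder local port
REMOTE_PORT = 10000                           # placeholder remote service port

def open_tunnel() -> subprocess.Popen:
    """Forward LOCAL_PORT on this machine to REMOTE_PORT on the remote host."""
    return subprocess.Popen(
        [
            "ssh",
            "-N",  # do not run a remote command; just forward the port
            "-L", f"{LOCAL_PORT}:localhost:{REMOTE_PORT}",
            REMOTE_HOST,
        ]
    )

if __name__ == "__main__":
    tunnel = open_tunnel()
    print(f"Forwarding localhost:{LOCAL_PORT} to {REMOTE_HOST}:{REMOTE_PORT}")
    try:
        tunnel.wait()
    except KeyboardInterrupt:
        tunnel.terminate()
```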
