AI Workbench Locations#
Overview#
- A location is a machine with AI Workbench installed.
Installing AI Workbench provides a managed layer on the system that simplifies creating, accessing and using containerized development environments on that machine.
- Locations provide a consistent experience regardless of the underlying operating system.
A Workbench location provides the same UI/UX regardless of whether it runs Windows, macOS or Ubuntu. The only differences between locations are their hardware capabilities and the projects on them.
- The Desktop App is not a location - It’s a UI.
The Desktop App is a lightweight Electron app that runs locally to provide visibility and access to different locations. It is not the full AI Workbench application.
- A location is not a cluster - There’s no Kubernetes involved.
AI Workbench is built to run on a single instance, and the Desktop App lets you work with AI Workbench on different instances. However, Workbench has no notion of a “cluster”, so there is no handling of multiple networked locations.
Key Concepts#
- Remote Only Mode:
The initial state on your local machine when you first install the Desktop App. You cannot work locally yet, but you can add remote locations immediately.
- Full Local Install:
Optionally installing the full AI Workbench application on your local system. This includes managed installation and updating of dependencies such as a container runtime and Git-LFS.
- Location Manager:
The main application window that opens when you start the Desktop App. It provides features to add and manage locations, and it shows cards for your existing locations.
- Local Location:
Your local machine (laptop or desktop) with the full AI Workbench application installed. It shows up as the “local” card in the Location Manager after you successfully complete the full local install.
- Remote Location:
A virtual or bare-metal machine on a network that you can access via SSH. It must have AI Workbench installed, and the operating system must be Ubuntu.
- Location Window:
A separate application window that opens when you click a location card to activate a location. It provides visibility, access and management for the projects on that specific location.
- NVIDIA Brev:
An NVIDIA platform that lets you find and provision GPU instances on different cloud providers. AI Workbench integrates with Brev to streamline authentication and adding cloud instances as locations.
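A remote location only has to be reachable over SSH and running Ubuntu. The sketch below is an illustrative pre-flight check you could run yourself before adding a machine as a location; it is not part of AI Workbench, the hostname is hypothetical, and the injectable `run` hook exists only so the check can be exercised without a network.

```python
import subprocess

def check_remote_location(host: str, run=subprocess.run) -> bool:
    """Pre-flight check for a candidate remote location: it must answer
    over SSH and report an Ubuntu operating system.

    `run` defaults to subprocess.run but is injectable so the check can
    be tested without touching the network.
    """
    try:
        # lsb_release -is prints the distributor ID, e.g. "Ubuntu"
        result = run(
            ["ssh", host, "lsb_release -is"],
            capture_output=True, text=True, timeout=30,
        )
    except Exception:
        # Unreachable host, timeout, or missing ssh binary
        return False
    return result.returncode == 0 and "Ubuntu" in result.stdout

# Example (hypothetical host):
# check_remote_location("user@gpu-box.example.com")
```

Because the check only needs a distributor ID over SSH, it works the same for virtual and bare-metal machines.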
Local vs Remote#
- “local” is only available if you do the full local install.
After you install the Desktop App, you can optionally perform the full local install to add the complete AI Workbench application to your system.
If you have NVIDIA GPUs on your local system, adding “local” allows you to work with them through AI Workbench.
- Use remote locations when you need better hardware or if you are blocked from working locally.
Remote locations make it easy to “lift and shift” workloads and development environments to machines with more scalable resources. They also provide an option if you aren’t allowed to do development and compute on your local machine.
- You can work on multiple locations at the same time.
Locations are independent, and the Desktop App (or CLI) lets you work on multiple locations at once. In addition, you can work on multiple projects on the same location at once.
Locations and Projects#
- A project can be in different states on different locations and for different users.
Projects move between locations through platforms like GitHub or GitLab, so all of the typical Git-based differences and resolutions for repositories on different machines apply.
- The AI Workbench management layer on a location handles host and user specific information to build the proper environment.
When a project moves between locations, AI Workbench knows what the environment should be from the versioned configuration files. However, the required container build and runtime typically depend on information that isn’t in the project repository.
For example, the host architecture (ARM vs. AMD64), source locations for storage mounts, and user-dependent environment variables (e.g. API keys) can’t be stored in the repository.
AI Workbench bridges this gap by using host information to render the containerfile and by collecting user input to complete the runtime configuration.
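The split described above can be sketched as a simple template render. This is an illustration only, not Workbench’s actual implementation: `PROJECT_CONFIG`, `render_containerfile`, `runtime_env`, and `MY_API_KEY` are hypothetical names. The point is that the versioned config travels with the repository, while the architecture and secrets are filled in fresh on each location.

```python
import os
import platform

# Versioned project config: safe to commit, identical on every location.
PROJECT_CONFIG = {
    "base_image": "ubuntu:22.04",
    "packages": ["git", "git-lfs", "python3"],
}

def render_containerfile(config: dict) -> str:
    """Render a containerfile from versioned config plus host-specific facts.

    The host architecture (e.g. x86_64 vs. aarch64) is discovered at build
    time on each location rather than stored in the repository.
    """
    arch = platform.machine()  # differs per location; never committed
    lines = [
        f"# Rendered for architecture: {arch}",
        f"FROM {config['base_image']}",
        "RUN apt-get update && apt-get install -y "
        + " ".join(config["packages"]),
    ]
    return "\n".join(lines)

def runtime_env(required_vars: list) -> dict:
    """Collect user-specific values (e.g. API keys) from the user's
    environment at run time -- these never live in the project repository."""
    return {name: os.environ.get(name, "") for name in required_vars}

containerfile = render_containerfile(PROJECT_CONFIG)
env = runtime_env(["MY_API_KEY"])  # hypothetical user-dependent variable
```

Keeping the render and the environment collection separate mirrors the two gaps the text describes: host facts complete the build, user input completes the runtime configuration.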