Use Multi-Container Environments#
Overview#
- You can create and use multi-container applications in AI Workbench using Docker Compose.
Multi-container environments let you run multiple services together. Common use cases include RAG pipelines and microservices architectures.
- You can manage multi-container environments in the Desktop App without any terminal commands.
AI Workbench has controls for starting, stopping, and monitoring services. Select profiles to run different service configurations from a single compose file.
- Compose services run alongside your project container with shared networking.
Services communicate using service names as hostnames. Web services can be proxied through AI Workbench for easy access.
Key Concepts#
- Compose File
A docker-compose.yml or compose.yaml file that specifies services, networks, and volumes.
- Service
An individual container defined in the compose file.
- Profile
A tag that enables conditional service activation based on selected profiles.
- NVWB_TRIM_PREFIX
An environment variable that enables AI Workbench proxy integration for web services.
Configure Compose File Location#
- Step One: Open your project spec.yaml file.
Select Project Tab > Files > .project > spec.yaml
Select Option Dots > Edit
- Step Two: Set the compose file path.
Locate the environment section
Set compose_file_path to your compose file location:

```yaml
environment:
  base:
    # ... other base configuration ...
    compose_file_path: compose.yaml
```
Use a path relative to the project root
Common locations:
compose.yaml, docker-compose.yml, deploy/compose.yaml
Select Save
- Step Three: Create your compose file at the specified location.
The compose file must exist at the path you specified
Use the procedures below to create and edit the compose file
Success: AI Workbench will display compose controls in the Project Tab.
The compose file path can be empty if not using compose
Leave compose_file_path as an empty string (“”) if your project doesn’t use multi-container environments.
AI Workbench will not show compose controls in the Desktop App.
Create a Compose File#
- Step One: Open the compose section.
Select Project Tab > Environment > Compose
Select Create compose file
- Step Two: Write your compose configuration.
The editor opens with a template
Define your services, networks, and volumes
Use the built-in Cheat Sheet for syntax reference
Select Save when complete
Success: The compose file is created in your project repository.
Start with a simple example
Begin with a single service to verify your compose setup works. Add additional services incrementally. See Multi-Container Environments (Docker Compose) for complete examples.
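A minimal starting point might look like the following sketch (the nginx image and port mapping are placeholders for illustration, not part of any AI Workbench template):

```yaml
services:
  web:
    # any small web image works for verifying the compose setup
    image: nginx:1.27
    ports:
      - "8080:80"
```

Once this single service starts and is reachable, add the services you actually need one at a time.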
Edit a Compose File#
You can edit the compose file through the Desktop App or directly in your repository.
Using the Desktop App#
- Step One: Open the compose editor.
Select Project Tab > Environment > Compose
Select Edit compose file
- Step Two: Make your changes.
Edit the compose configuration in the editor
Select Save
- Step Three: Restart services to apply changes.
Stop the compose environment if running
Start the compose environment with updated configuration
Success: Changes are saved and will take effect on next start.
Using a File Editor#
- Step One: Open the compose file.
Select Project Tab > Files > compose.yaml
Select Option Dots > Edit
Or open the file in VS Code, JupyterLab, or another editor
- Step Two: Make your changes.
Edit the compose configuration
Save the file
- Step Three: Restart services to apply changes.
Stop the compose environment if running in AI Workbench
Start the compose environment with updated configuration
Success: Changes are saved and will take effect on next start.
Start a Multi-Container Environment#
- Step One: Select profiles if your compose file uses them.
Select Project Tab > Environment > Compose
Select one or more profiles from the Profile dropdown
If no profiles are defined, skip this step
- Step Two: Start the compose environment.
Select Start
AI Workbench runs docker compose up with the selected profiles
- Step Three: Monitor service startup.
Watch the Compose Output section for logs
Services appear in the compose section as they start
Healthchecks must pass for services that define them
Success: Services start and appear in the compose section showing their status.
Web services with NVWB_TRIM_PREFIX become accessible through the proxy
Services with NVWB_TRIM_PREFIX: "true" in their environment variables are proxied by AI Workbench.
Access URLs appear on the service cards in the Desktop App.
Stop a Multi-Container Environment#
- Step One: Open the compose section.
Select Project Tab > Environment > Compose
- Step Two: Stop the compose environment.
Select Stop
AI Workbench runs docker compose down
Success: All services stop and the compose section shows services as stopped.
Configure GPU Access for Services#
Compose services can use GPUs by specifying resource requirements.
Using deploy.resources Syntax#
Add GPU configuration to your service definition:
```yaml
services:
  nim-llm:
    image: nvcr.io/nim/meta/llama-3.1-8b-instruct:latest
    runtime: nvidia
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```
runtime: nvidia enables GPU access
count: 1 requests one GPU
The first available GPU will be assigned
Using device_ids for Specific GPUs#
Assign specific GPU devices to services:
```yaml
services:
  embedding-service:
    image: nvcr.io/nim/nvidia/llama-3.2-nv-embedqa-1b-v2:latest
    runtime: nvidia
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0']
              capabilities: [gpu]
  llm-service:
    image: nvcr.io/nim/meta/llama-3.1-70b-instruct:latest
    runtime: nvidia
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['1', '2']
              capabilities: [gpu]
```
device_ids: ['0'] assigns GPU 0
device_ids: ['1', '2'] assigns GPUs 1 and 2
Assigning specific devices prevents services from competing for the same GPU
Check GPU availability on your system
Use nvidia-smi in a terminal to see available GPUs and their IDs.
Ensure the project container is not using GPUs you want to assign to compose services.
Use Profiles for Service Variants#
Profiles enable running different service configurations from one compose file.
Define Profiles in Services#
Add profile tags to service definitions:
```yaml
services:
  llama-8b:
    image: nvcr.io/nim/meta/llama-3.1-8b-instruct:latest
    runtime: nvidia
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
    ports:
      - "8000:8000"
    profiles:
      - small-model
  llama-70b:
    image: nvcr.io/nim/meta/llama-3.1-70b-instruct:latest
    runtime: nvidia
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 2
              capabilities: [gpu]
    ports:
      - "8000:8000"
    profiles:
      - large-model
```
Only services matching the selected profile start
Services without profiles always start
Use profiles for model variants, deployment modes, or optional components
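As a sketch of the always-start rule above, a compose file can pair an unprofiled service with profiled variants (the service names and images here are illustrative):

```yaml
services:
  vector-db:
    # no profiles key, so this service starts regardless of profile selection
    image: pgvector/pgvector:pg16
  llama-8b:
    image: nvcr.io/nim/meta/llama-3.1-8b-instruct:latest
    profiles:
      - small-model   # starts only when the small-model profile is selected
  llama-70b:
    image: nvcr.io/nim/meta/llama-3.1-70b-instruct:latest
    profiles:
      - large-model   # starts only when the large-model profile is selected
```

Selecting small-model in AI Workbench would start vector-db and llama-8b but leave llama-70b stopped.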
Select Profiles in AI Workbench#
- Step One: Open the compose section.
Select Project Tab > Environment > Compose
- Step Two: Select desired profiles.
Select profiles from the Profile dropdown
Multiple profiles can be selected if your compose file supports it
- Step Three: Start the compose environment.
Select Start
Only services tagged with selected profiles will start
Success: Services matching selected profiles start.
Integrate Web Services with AI Workbench#
Set NVWB_TRIM_PREFIX to proxy web services through AI Workbench.
Add the environment variable to web services:
```yaml
services:
  frontend:
    image: myapp/frontend:latest
    ports:
      - "3000:3000"
    environment:
      - NVWB_TRIM_PREFIX=true
```
AI Workbench proxies services with this variable set to “true”
Access URLs appear in the Desktop App
Works for both local and remote locations
Success: The service is accessible through AI Workbench’s proxy with a URL shown in the Desktop App.
Not all services need NVWB_TRIM_PREFIX
Only web services that users access through a browser need proxying. Backend APIs, databases, and internal services should not set this variable.
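For example, a hypothetical frontend-plus-API stack would set the variable only on the browser-facing service (both images are placeholders):

```yaml
services:
  frontend:
    # users open this in a browser, so it is proxied by AI Workbench
    image: myapp/frontend:latest
    ports:
      - "3000:3000"
    environment:
      - NVWB_TRIM_PREFIX=true
  api:
    # internal service; the frontend reaches it at http://api:8000,
    # so it does not set NVWB_TRIM_PREFIX
    image: myapp/api:latest
```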
Version Compose Configurations#
- Step One: Commit the compose file.
Select Project Tab > Git > Changes
Select the compose file to include in the commit
Select Commit
- Step Two: Push changes to the remote repository.
Select Push to sync with the remote
Success: The compose file is versioned with your project.
Container images are not versioned
The compose file is versioned, but the container images used for services are not. Image versions are determined by tags in the compose file. Teammates need access to the same registries to pull images.
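One way to keep environments consistent across teammates is to pin an explicit image tag in the versioned compose file rather than relying on latest (the version tag below is illustrative):

```yaml
services:
  llm:
    # a pinned tag resolves to the same image for everyone;
    # :latest can resolve differently over time
    image: nvcr.io/nim/meta/llama-3.1-8b-instruct:1.1.2
```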
Troubleshooting#
- Compose controls not visible in Desktop App
Verify compose_file_path is set in spec.yaml
Ensure the compose file exists at the specified path
Check that AI Workbench is using Docker runtime (not Podman)
- Services fail to start
Check compose output logs for error messages
Verify all required images are accessible
Ensure ports are not already in use
Check that GPUs are available if services request them
- GPU allocation errors
Verify GPU IDs with nvidia-smi
Ensure the project container is not using requested GPUs
Check that multiple services are not requesting the same specific GPU
Try using count instead of device_ids to let Docker assign GPUs
- Services cannot communicate
Ensure services are on the same network
Use service names as hostnames for inter-service communication
Check that dependent services have started and passed healthchecks
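The points above can be sketched in compose syntax: one service reaches another at its service name on the shared network, and depends_on with a healthcheck condition delays startup until the dependency is healthy (the service names, endpoint, and URL are illustrative assumptions):

```yaml
services:
  llm:
    image: nvcr.io/nim/meta/llama-3.1-8b-instruct:latest
    healthcheck:
      # assumes the container provides curl and a readiness endpoint
      test: ["CMD", "curl", "-f", "http://localhost:8000/v1/health/ready"]
      interval: 10s
      retries: 30
  app:
    image: myapp/app:latest
    environment:
      # the llm service is addressable by its service name
      - LLM_URL=http://llm:8000
    depends_on:
      llm:
        condition: service_healthy
```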
- Web services not accessible
Verify NVWB_TRIM_PREFIX is set to "true" exactly
Check that the service is exposing the correct port
Ensure the service has started successfully