Frequently Asked Questions for AI Workbench Integrations#
Does Configuring an Integration Give NVIDIA Access to My Credentials or Accounts?#
No. NVIDIA does not get any access to your credentials or accounts.
Configuring an integration just provides the Desktop App installed on your laptop with the necessary permissions to authenticate to the given platform or service.
In addition, the Desktop App has no telemetry of any kind, so none of your information is ever sent back to NVIDIA.
You can revoke this access at any time by disconnecting the integration.
What Kind of Access Does AI Workbench Actually Need to My GitHub or GitLab Resources to Manage Projects?#
AI Workbench needs full API access to manage repositories on your behalf, including creating, modifying, and pushing changes to both public and private repositories.
If I use a PAT or API Key as the Authentication Mechanism, How Does AI Workbench Store It?#
AI Workbench stores and uses credentials for connected integrations. By default, all credentials are encrypted at rest using the host system’s secret storage, which is tied to your login.
On macOS, credentials are stored in the Keychain. On Windows, they are stored using the Windows Credential Manager. On Linux, credentials are stored using the D-Bus Secret Service, which is gnome-keyring by default on Ubuntu.
This ensures that your credentials are secure and protected, while still being easily accessible when needed.
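On Linux you can see the same mechanism at work with libsecret’s command-line client, which talks to the same Secret Service D-Bus API. This is a sketch only: it requires a running keyring daemon such as gnome-keyring, and the `service`/`example` attribute pair is illustrative, not what AI Workbench uses internally.

```shell
# Store a secret in the Secret Service keyring, then read it back.
# secret-tool reads the secret itself from stdin.
if command -v secret-tool >/dev/null 2>&1; then
  printf 'my-token' | secret-tool store --label='example token' service example \
    && secret-tool lookup service example \
    || echo "no Secret Service daemon available"
else
  echo "secret-tool (libsecret) is not installed"
fi
```

Because the secret is held by the keyring daemon and unlocked with your login session, it stays encrypted on disk but remains available to applications you authorize.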
How Does Workbench Pass Credentials to a Location?#
When you start a location using the Desktop App or the CLI, the associated Workbench service is started and your credentials are pushed into memory for use by the service.
Depending on your container runtime, AI Workbench may write a temporary file or act as a credential helper to provide credentials to the runtime. If written to a file, the file is removed at service stop.
When using Git with AI Workbench, your credentials are provided to Git through a custom credential helper. This keeps your credentials safe and secure, without the need to store them on the file system.
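AI Workbench’s own helper is internal, but the git credential-helper protocol it speaks is standard: Git invokes the helper with an action (`get`, `store`, or `erase`) and writes `key=value` request lines on stdin, terminated by a blank line. A minimal sketch, with the helper answering from an environment variable (the `WORKBENCH_GIT_TOKEN` name is illustrative) rather than from disk:

```shell
# Sketch of a git credential helper that serves a token from memory.
# (Illustrative only; AI Workbench's real helper serves credentials
# held in memory by the Workbench service.)
credential_helper() {
  case "$1" in
    get)
      # Consume the request Git sends (protocol=..., host=..., blank line).
      while IFS= read -r line; do
        [ -z "$line" ] && break
      done
      # Reply in the same key=value format, reading the token from memory.
      echo "username=token"
      echo "password=${WORKBENCH_GIT_TOKEN}"
      ;;
    store|erase)
      : # nothing is written to disk, so there is nothing to store or erase
      ;;
  esac
}

# Simulate Git asking for a credential for github.com.
WORKBENCH_GIT_TOKEN=example-token
printf 'protocol=https\nhost=github.com\n\n' | credential_helper get
```

Git is pointed at a helper like this via its `credential.helper` configuration, so no token ever needs to be written into `.git-credentials` or the repository configuration.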
Can I Use the CLI to Connect an Integration?#
Yes. This has two steps. First you create the integration, and then you connect it to trigger the authentication flow.
Run the following command to start the interactive integration setup:
nvwb create integration
Select the integration you want to connect and press Enter.
Enter the required details as prompted.
Once setup is complete, AI Workbench lists all configured integrations. Verify that your new integration appears.
To connect to the integration, run:
nvwb connect integration <integration name>
How Do I Disconnect an Integration?#
Open AI Workbench Settings.
Select the Integrations page.
Click Disconnect on the integration card you want to remove.
Do I Need an NGC API Key to Use Workbench?#
No. The default containers used for project creation are public, so you don’t need an API key to use them.
However, you do need an NGC API Key if you want to use the following from build.nvidia.com:
The free endpoints for models like Llama 3.1 405B Instruct.
NVIDIA Blueprints like PDF-to-Podcast.
What’s the Difference Between Using an NGC API Key to Access Private Containers on NGC and Using It to Access Endpoints on build.nvidia.com?#
Configuring the NGC API key in the Workbench Settings page makes it available to use with any location or project that pulls the base image from a private registry on NGC.
It does not configure access to any API endpoints on build.nvidia.com.
However, the same personal key also grants access to the endpoints on build.nvidia.com; you must configure it on a project-by-project basis as a sensitive environment variable.
See Manage Runtime Settings for more information on how to configure environment variables.
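As a sketch, a project could then call an endpoint with the key read from that environment variable. The endpoint URL, model name, and `NVIDIA_API_KEY` variable name below are illustrative assumptions; check build.nvidia.com for the exact values for your model.

```shell
# Illustrative request to a build.nvidia.com endpoint. In a Workbench
# project, NVIDIA_API_KEY would be configured as a sensitive environment
# variable rather than exported in a script.
export NVIDIA_API_KEY="nvapi-..."

# The fallback message keeps the sketch safe to run without network access.
curl --max-time 10 https://integrate.api.nvidia.com/v1/chat/completions \
  -H "Authorization: Bearer $NVIDIA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "meta/llama-3.1-405b-instruct",
       "messages": [{"role": "user", "content": "Hello"}]}' \
  || echo "request failed (no network access or invalid key)"
```

Because the key is read from the environment at request time, it never appears in the project’s source files or container image.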