Nvidia Transfer Learning REST API
The Nvidia Transfer Learning API exposes dataset and experiment endpoints for setting up and running actions.
The examples in this section use curl commands and jq for JSON processing, on a Linux machine with both curl and jq pre-installed.
If you use the Python requests library to make API calls instead, Python 3.11 is required.
User authentication is based on the NGC API key. For more details, see the API reference.
For example:
BASE_URL=https://api-ea2.tao.ngc.nvidia.com/api/v1
NGC_API_KEY=zZYtczM5amdtdDcwNjk0cnA2bGU2bXQ3bnQ6NmQ4NjNhMDItMTdmZS00Y2QxLWI2ZjktNmE5M2YxZTc0OGyS
CREDS=$(curl -s -X POST $BASE_URL/login -d '{"ngc_api_key": "'"$NGC_API_KEY"'"}')
USER=$(echo $CREDS | jq -r '.user_id')
echo $USER
TOKEN=$(echo $CREDS | jq -r '.token')
For example, an API call for listing datasets might be:
curl -s -X GET $BASE_URL/users/$USER/datasets -H "Authorization: Bearer $TOKEN"
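Since the response is JSON, it can be processed with jq in the same way as the login response. For example, to print just the dataset IDs (assuming each entry in the returned list has an id field; see the API reference for the exact response schema):
curl -s -X GET $BASE_URL/users/$USER/datasets -H "Authorization: Bearer $TOKEN" | jq -r '.[].id'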
The Nvidia Transfer Learning API service includes methods for managing the contents of experiment workspaces, such as user datasets and experiments, as well as methods for executing NVTL actions on the data and specifications stored in those workspaces.
Typically, you create a dataset for a specific network type, create an experiment that points to this dataset, pick a base experiment, and customize the specs before executing network-related actions, as sketched below.
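A minimal sketch of that flow with curl might look like the following. The endpoint paths, payload fields, and response shapes here are illustrative assumptions rather than the definitive contract; consult the API reference for the actual schema.
# 1. Create a dataset for a given network type (payload fields are assumptions)
DATASET_ID=$(curl -s -X POST $BASE_URL/users/$USER/datasets \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"type": "object_detection", "format": "kitti"}' | jq -r '.id')
# 2. Create an experiment that points to the dataset
EXPERIMENT_ID=$(curl -s -X POST $BASE_URL/users/$USER/experiments \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"network_arch": "detectnet_v2", "train_datasets": ["'"$DATASET_ID"'"]}' | jq -r '.id')
# 3. Run an action against the experiment (the action name and job payload are assumptions)
curl -s -X POST $BASE_URL/users/$USER/experiments/$EXPERIMENT_ID/jobs \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"action": "train"}'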
See the TAO Toolkit API Reference for more details.
The tutorial notebooks provide examples that demonstrate the various workflows.
Download the resource using the NGC CLI.
ngc registry resource download-version "nvidia/tao/tao-getting-started:5.3.0"
Find the tutorial notebooks (.ipynb files):
cd tao-getting-started_v5.3.0/notebooks/tao_api_starter_kit/api
Serve these notebook examples using the Jupyter-lab pod.
Warning: Jupyter-lab pods are not multi-tenant and present a security risk, because a user could gain access to manage the whole GPU cluster from within the pod.
Also, the instructions below for forwarding the port and using the Jupyter-lab pod are only meant to show how to launch the TAO API notebooks across different Cloud Service Providers (CSPs) in a cloud-agnostic way. Each organization has its own security policies regarding opening cloud service ports publicly, so make sure you review and comply with them before executing the steps below.
NVTL API version 5.3.0 provides a Jupyter-lab pod with the dependencies required to launch notebooks pre-installed.
On the remote machine where the one-click deployment scripts are run, execute the following command. Note that this command does not terminate automatically; to run any new commands on this machine, open a new terminal.
kubectl port-forward service/nvtl-api-jupyterlab-service :8888
Example output:
Forwarding from 127.0.0.1:33465 -> 8888
Forwarding from [::1]:33465 -> 8888
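If you prefer a fixed local port over the randomly assigned one, kubectl also accepts an explicit local:remote port pair (8888 below is just an example of a free local port); in that case, use the port you chose in the steps that follow.
kubectl port-forward service/nvtl-api-jupyterlab-service 8888:8888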
Copy the port number that follows the address 127.0.0.1 in the output of the previous step.
In a new terminal window on your local machine, set up SSH tunneling to the remote machine with the following command, where user_name and IP_address correspond to the machine where the deployment scripts are run. Note that this command does not terminate automatically.
ssh -N -L <port_number_copied>:localhost:<port_number_copied> <user_name>@<IP_address>
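For example, with the port 33465 from the example output above and a hypothetical user and address, the tunnel command would look like:
ssh -N -L 33465:localhost:33465 ubuntu@203.0.113.10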
In a browser on your local machine, visit the following address:
localhost:<port_number_copied>/notebook/lab
The JupyterLab session running inside the Jupyter-lab pod is now accessible in your browser.
You can run the API or TAO-Client notebooks inside the tao_end2end folder. Inside the notebooks, for the FIXME of ip_address, use ingress-nginx-controller, and for the FIXME of port_number, use 80.
You can view the pod name using:
kubectl get pods -n default | grep "jupyter"
You can copy files into the pod using:
kubectl cp <path to local file system> <jupyterlab-pod pod name>:<path inside jupyterlab-pod>
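For example, copying a local notebook into the pod could be scripted as follows. The grep/awk pod-name lookup follows the command above; the /workspace target path is an assumption for illustration, so adjust it to the actual directory inside the pod.
# Resolve the Jupyter-lab pod name, then copy a notebook into it
POD=$(kubectl get pods -n default | grep "jupyter" | awk '{print $1}')
kubectl cp ./my_notebook.ipynb $POD:/workspace/my_notebook.ipynb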
You can also launch the notebooks without the Jupyter-lab pod by installing the Jupyter notebook pip package and using it to launch the notebooks from the getting started directory. Instead of using NGC, the tutorial notebooks can also be downloaded from the machine where the TAO API is deployed. You can obtain them using:
wget https://<ip_of_hosted_machine>:<nginx_service_forwarded_port>/tao_api_notebooks.zip
unzip tao_api_notebooks.zip
cd api
pip3 install jupyter notebook
jupyter notebook --ip 0.0.0.0