.. _requirements_and_installation:

Requirements and Installation
=============================

The TLT is designed to run on x86 systems with an NVIDIA GPU (e.g., a GPU-powered workstation or a DGX system), or in any cloud with an NVIDIA GPU. For inference, models can be deployed on any edge device, such as an embedded Jetson platform, or in a data center with GPUs such as the T4 or A100.

This page lists the recommended system requirements for installing and using the TLT.

Hardware Requirements
---------------------

The following system configuration is recommended to achieve reasonable training performance with the TLT and the supported models:

* 32 GB system RAM
* 32 GB of GPU RAM
* 8-core CPU
* 1 NVIDIA GPU
* 100 GB of SSD space

The TLT is supported on A100, V100, and RTX 30x0 GPUs.

Software Requirements
---------------------

In addition to the TLT package, the following software is required to take advantage of all the tutorials, examples, and supported models within the provided containers:

* Ubuntu 18.04 LTS
* `NVIDIA GPU Cloud`_ account and API key
* `docker-ce`_
* `nvidia docker2`_

.. _NVIDIA GPU Cloud: https://ngc.nvidia.com/
.. _docker-ce: https://docs.docker.com/install/linux/docker-ce/ubuntu/
.. _nvidia docker2: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html

.. Note:: DeepStream 5.0 - `NVIDIA SDK for IVA inference`_ is recommended.

.. _NVIDIA SDK for IVA inference: https://developer.nvidia.com/deepstream-sdk

.. _install_prereq:

Installation Prerequisites
--------------------------

Perform the following prerequisite steps before installing the TLT:

1. Install `Docker`_.

2. Install the `NVIDIA GPU driver`_ v455.xx or above.

3. Install `nvidia docker2`_.

4. Get an `NGC`_ account and API key:

   a. Go to NGC and click the **Transfer Learning Toolkit** container in the **Catalog** tab. This message is displayed: "Sign in to access the PULL feature of this repository".
   b. Enter your email address and click **Next**, or click **Create an Account**.
   c. Choose your organization when prompted for **Organization/Team**.
   d. Click **Sign In**.

5. Execute :code:`docker login nvcr.io` from the command line and enter these login credentials:

   a. Username: "$oauthtoken"
   b. Password: "YOUR_NGC_API_KEY"

.. Note:: If you have followed the default installation instructions for :code:`docker-ce`, you may need :code:`sudo` access to run :code:`docker` commands. To avoid this, follow these `post-installation steps`_ so that docker commands can be run without :code:`sudo`.

.. _post-installation steps: https://docs.docker.com/engine/install/linux-postinstall/
.. _Docker: https://www.docker.com/
.. _NVIDIA GPU driver: https://www.nvidia.com/Download/index.aspx?lang=en-us
.. _NGC: https://ngc.nvidia.com/

Installation
------------

The Transfer Learning Toolkit (TLT) is a Python pip package that is available for download from the NVIDIA DevZone. The package uses the docker CLI internally to interact with the NGC Docker registry to download and instantiate the underlying docker containers. You must have an NGC account and an API key associated with your account. See the :ref:`Installation Prerequisites <install_prereq>` section for details on creating an NGC account and obtaining an API key.

Running the Transfer Learning Toolkit
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The procedure to install and run the Transfer Learning Toolkit is detailed in :ref:`this section`.
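As a minimal sketch of the typical flow, assuming the launcher is published as the ``nvidia-tlt`` package on NVIDIA's Python package index (verify the exact package name and version on the NVIDIA DevZone), installation and registry login might look like this:

.. code-block:: bash

   # Optional but recommended: isolate the TLT launcher in a virtual environment.
   python3 -m venv tlt-env
   source tlt-env/bin/activate

   # Assumption: the launcher is distributed via NVIDIA's pip index as "nvidia-tlt";
   # confirm the authoritative package name on the NVIDIA DevZone.
   pip3 install nvidia-pyindex
   pip3 install nvidia-tlt

   # Authenticate the local docker CLI against the NGC registry so the launcher
   # can pull the underlying TLT containers (see Installation Prerequisites).
   docker login nvcr.io
   # Username: $oauthtoken
   # Password: <YOUR_NGC_API_KEY>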
Use the examples
****************

Example Jupyter notebooks for all the tasks supported in the TLT are available in NGC `resources`_. The TLT provides sample workflows for :ref:`Computer Vision <cv_samples>` and :ref:`Conversational AI <conv_ai_samples>`.

.. _resources: https://ngc.nvidia.com/catalog/resources

**Computer Vision**

.. _cv_samples:

All the samples for the supported computer vision tasks are hosted on NGC under `TLT Computer Vision Samples`_. To run the available examples, download this sample resource using the following commands:

.. _TLT Computer Vision Samples: https://ngc.nvidia.com/catalog/resources/nvidia:tlt_cv_samples

.. code-block:: shell

   wget --content-disposition https://api.ngc.nvidia.com/v2/resources/nvidia/tlt_cv_samples/versions/v1.0.2/zip -O tlt_cv_samples_v1.0.2.zip
   unzip -u tlt_cv_samples_v1.0.2.zip -d ./tlt_cv_samples_v1.0.2 && rm -rf tlt_cv_samples_v1.0.2.zip && cd ./tlt_cv_samples_v1.0.2

**Conversational AI**

.. _conv_ai_samples:

The TLT Conversational AI package provides several end-to-end sample workflows for training conversational AI models with the TLT and subsequently deploying them to Jarvis. You can find these samples at:

+--------------------------------+----------------------------------------+
| **Conversational AI Task**     | **Jupyter Notebooks**                  |
+================================+========================================+
| Speech to Text                 | `Speech to Text Notebook`_             |
+--------------------------------+----------------------------------------+
| Question Answering             | `Question Answering Notebook`_         |
+--------------------------------+----------------------------------------+
| Text Classification            | `Text Classification Notebook`_        |
+--------------------------------+----------------------------------------+
| Token Classification           | `Token Classification Notebook`_       |
+--------------------------------+----------------------------------------+
| Punctuation and Capitalization | `Punctuation Capitalization Notebook`_ |
+--------------------------------+----------------------------------------+
| Intent and Slot Classification | `Intent Slot Classification Notebook`_ |
+--------------------------------+----------------------------------------+

.. _Speech to Text Notebook: https://ngc.nvidia.com/resources/nvidia:tlt-jarvis:speechtotext_notebook
.. _Question Answering Notebook: https://ngc.nvidia.com/resources/nvidia:tlt-jarvis:questionanswering_notebook
.. _Text Classification Notebook: https://ngc.nvidia.com/resources/nvidia:tlt-jarvis:textclassification_notebook
.. _Token Classification Notebook: https://ngc.nvidia.com/resources/nvidia:tlt-jarvis:tokenclassification_notebook
.. _Punctuation Capitalization Notebook: https://ngc.nvidia.com/resources/nvidia:tlt-jarvis:punctuationcapitalization_notebook
.. _Intent Slot Classification Notebook: https://ngc.nvidia.com/resources/nvidia:tlt-jarvis:intentslotclassification_notebook

You can download these resources by using the NGC CLI command available on the NGC resource page.
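As an illustration, a download with the NGC CLI generally follows the pattern below; the resource path and version tag shown here are assumptions, so copy the exact download command from the corresponding NGC resource page:

.. code-block:: bash

   # Illustrative only: replace the org/team/resource path and version tag with the
   # values shown in the "Download" section of the resource's NGC page.
   ngc registry resource download-version "nvidia/tlt-jarvis/speechtotext_notebook:v1.0"

   # The notebook and its assets are placed in a local directory named after the
   # resource and version.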
Once you have downloaded the respective tutorial resource, you can start the Jupyter notebook server:

.. code-block:: bash

   jupyter notebook --ip 0.0.0.0 --allow-root --port 8888

Copy and paste the link produced by this command into your browser to access the notebook. The /workspace/examples folder contains a demo notebook. Feel free to use any available free port to host the notebook if port 8888 is unavailable.

.. _downloading_the_models:

Downloading the Models
^^^^^^^^^^^^^^^^^^^^^^

The Transfer Learning Toolkit Docker gives you access to a repository of pretrained models that can serve as a starting point when training deep neural networks. These models are hosted on NGC. To download the models, download and install the NGC CLI; more information about the NGC Catalog CLI is available `here`_. Once you have installed the CLI, follow the instructions below to configure the NGC CLI and download the models.

.. _here: https://docs.nvidia.com/ngc/ngc-catalog-cli-user-guide/index.html

Configure the NGC API key
*************************

Using the NGC API key obtained in :ref:`Installation Prerequisites <install_prereq>`, configure the NGC CLI by executing this command and following the prompts:

.. code-block:: bash

   ngc config set

Get a list of models
********************

Use this command to get a list of models that are hosted in the NGC model registry:

.. code-block:: bash

   ngc registry model list

For example, to list the computer vision models:

.. code-block:: bash

   ngc registry model list nvidia/tlt_pretrained_*

.. Note:: All our classification models have names based on this template: ``nvidia/tlt_pretrained_classification:``
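Once you have identified a model from the list, it can be pulled with the NGC CLI's ``download-version`` command. The model path, version tag, and destination directory below are illustrative assumptions; use the exact values reported by ``ngc registry model list``:

.. code-block:: bash

   # Illustrative example: substitute the model path and version tag with values
   # reported by "ngc registry model list nvidia/tlt_pretrained_*".
   ngc registry model download-version nvidia/tlt_pretrained_classification:resnet18 --dest ./pretrained_models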