Cloud Quickstart - Azure

This guide aims to get you up and running with a cloud quickstart of the Multi-Camera Sim2Deploy workflow, consisting of the Simulation & Synthetic Data Generation (SDG) workflow and the Real-Time Location System (RTLS) workflow.

Prerequisites

  • Sign up for the Developer Preview of Metropolis Microservices. Review the details here.

  • Refer to the prerequisites to obtain the required access and information for using the automated deployment scripts.

  • For best performance, we recommend A100/H100 GPUs for the RTLS workflow and L40S GPUs for the SDG workflow. For this guide, however, we will use the following Azure instance types:

    • For the SDG workflow: a 2x A10 GPU instance - Standard_NV72ads_A10_v5

    • For the RTLS workflow: a 4x A100 GPU instance - Standard_NC96ads_A100_v4

Architecture Overview

  • Deployment artifacts, via infrastructure as code, provision two cloud instances in a single Azure Virtual Network (VNet). The instances share Azure Blob Storage, mounted with BlobFuse, so that SDG-generated data can be consumed by the RTLS workflow for analytics.

  • The SDG instance comes pre-loaded with a simple warehouse digital twin containing:

    • 12 cameras

    • 8 digital human (DH) characters

  • Shared storage is pre-populated with synthetic videos & calibration data generated from the digital twin:

    • 12 10-minute videos

    • Calibration data

  • The RTLS instance comes pre-configured with the videos and calibration data from the shared storage.

  • Users can explore the UI & API endpoints of the SDG and RTLS workflows.

  • Users can use the SDG workflow to modify the digital twin, export new synthetic videos to shared storage, and follow instructions to load them in the RTLS instance.

  • While this guide provides the steps to get up and running, you can check out the references above to get more from your deployment.
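Once both instances are up, the shared BlobFuse mount can be verified from either VM. A minimal sketch, assuming the blob container is mounted at /mnt/sdg-data (the path used later in this guide):

```shell
# Verify the shared Blob Storage mount is present.
# MOUNT_POINT is an assumption: /mnt/sdg-data is the path this guide uses later.
MOUNT_POINT="${MOUNT_POINT:-/mnt/sdg-data}"
if mountpoint -q "$MOUNT_POINT"; then
  echo "OK: $MOUNT_POINT is mounted"
  df -h "$MOUNT_POINT"
else
  echo "WARNING: $MOUNT_POINT is not a mount point" >&2
fi
```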

The diagram below shows an example deployment architecture with the SDG workflow running on an L40S-based instance and the RTLS workflow running on an H100-based instance:

Multi-Camera Sim2Deploy workflow on Azure

Set Up the Workflows in the Cloud

RTLS & SDG Workflow

  1. Download and un-tar the deployment artifact from NGC, then change into the extracted subfolder.

    # download the artifact
    $ ngc registry resource download-version nfgnkvuikvjm/mdx-v2-0/metropolis-azure-nv-one-click-script:0.0.1
    
    
    # untar the deploy script artifact
    $ cd metropolis-azure-nv-one-click-script_v0.0.1/
    $ tar -xvf deploy-mdx-azure-cns.tar.gz
    
    
    # verify the files required for installing the infrastructure on Azure
    $ ls
    deploy-mdx-azure-cns.tar.gz  dist
    $ cd dist
    
    
    $ ls
    ansible-requirements.yml  config-files  deploy-template.yml  envbuild.sh  modules    README.md
    cns                       config.yml    envaccess.sh         iac-ref      playbooks  setup-system.sh
    $
    
  2. Edit the config.yml file (see samples in the Appendix).

  3. Run the following commands to deploy the Metropolis RTLS and SDG app:

    Important

    • To run the envbuild.sh script seamlessly, first run the system setup script with bash setup-system.sh. This script sets up the source system from which the deploy script will be triggered.

    • The script configures the system with the following packages: Python 3.9, jq (latest available version), yq v4.41.1, NGC CLI (latest available version), Terraform 1.6.6, and Ansible 2.14.8.
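After running setup-system.sh, a quick way to confirm the expected tools landed on the source system (the tool names below come from the package list above; exact version output formats vary by tool):

```shell
# Report which of the required tools are on PATH, with their versions.
# "python3" is assumed to be the installed Python's command name.
for tool in python3 jq yq ngc terraform ansible; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool - $("$tool" --version 2>&1 | head -n 1)"
  else
    echo "missing: $tool"
  fi
done
```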

    # To view available options
    $ ./envbuild.sh --help    # running ./envbuild.sh with no arguments also prints usage
    
    # To preview changes based on config.yml without applying them (-d performs a dry run)
    $ ./envbuild.sh install -c all -d
    
    # To apply the changes shown in the preview, based on deploy-template.yml
    $ ./envbuild.sh install -c all
    
    # To apply only part of the deployment, use the skip flags (see --help for the
    # supported flags). For example, -i skips infrastructure-related changes; multiple
    # skip flags can be combined in the same invocation.
    $ ./envbuild.sh install -c all -i
    
    # To uninstall the deployed infrastructure and application
    $ ./envbuild.sh uninstall -c all
    
    1. Note down the generated output details for accessing the SDG VM instance UI, including the NoMachine credentials.

Explore RTLS Workflow

The RTLS workflow comes pre-configured with sample multi-camera videos & calibration data generated from the sample warehouse digital twin, which we explore in this section.

Note

Although we leverage synthetic data as part of the Multi-Camera Sim2Deploy overall workflow, the RTLS workflow is agnostic to whether the input multi-camera data is real or synthetic.

  1. Refer to the verify deployment section of the Deployment Guide to navigate to the RTLS UI. The RTLS UI will look as follows:

Sim2Deploy RTLS Reference App UI

The main window displays the floor plan map of the analyzed space. Each dot moving on the map indicates the location of a globally identified unique object. Each object is labeled with its global ID, and live motion is marked with colored trajectories. Camera icons on the map indicate the location and orientation of all cameras in use; the field of view of each camera can be viewed by hovering over the corresponding camera icon. At the bottom of the UI, the total number of currently detected unique objects is displayed.

Note

The AMR count shown on the RTLS UI is 0, since there is no active AMR data flowing in the default synthetic videos.

For in-depth documentation of the UI, refer to the Real Time Location System UI section.

  2. Kibana is also brought up as part of the RTLS app 1-click script. From the sample screen capture below, you can view the mdx-rtls data, which contains the location info of each globally identified unique object in real time.

Sim2Deploy RTLS Reference App Kibana

Kibana is a powerful tool for visualizing data. You can create other index patterns on existing data (e.g., mdx-raw) or create dashboards. You can read more about Kibana in its official documentation. Kibana is available at http://<alb_dns_name>:31560.
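To confirm Kibana is reachable before opening it in a browser, a quick sketch (the <alb_dns_name> placeholder must be replaced with your deployment's actual load balancer DNS name; /api/status is Kibana's standard status endpoint):

```shell
# Build the Kibana URL from the load balancer DNS name.
ALB_DNS_NAME="<alb_dns_name>"          # placeholder -- replace with your ALB DNS name
KIBANA_URL="http://${ALB_DNS_NAME}:31560"
echo "Kibana URL: $KIBANA_URL"
# Once the placeholder is filled in, probe Kibana's status endpoint:
# curl -s "${KIBANA_URL}/api/status" | jq '.status.overall'
```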

Explore SDG Workflow

Important

Please refer to the video tutorial for SDG data generation available here. This tutorial provides a walkthrough of the Isaac Sim UI.

The Simulation & SDG workflow is pre-loaded with a sample warehouse digital twin, including DH characters & cameras.

  1. Connect to the SDG workflow using the NoMachine remote desktop software. The NoMachine private key file should be at /<deploy artifact directory>/dist/ssh.pem.
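Before NoMachine (or ssh) will accept the private key, its permissions typically need to be owner-only. A minimal sketch, run from the deploy artifact directory:

```shell
# Restrict the NoMachine/SSH private key to owner read/write only.
KEY="dist/ssh.pem"        # path relative to the deploy artifact directory
chmod 600 "$KEY"
ls -l "$KEY"              # should show -rw-------
```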

  2. Before starting with synthetic data generation, we recommend users familiarize themselves with the Isaac Sim application by going through What is Isaac Sim? and the Isaac Sim Interface Guide.

  3. After getting familiar with Isaac Sim, you can follow the Omni.Replicator.Agent guide to get started with synthetic data generation.

    1. Users can follow the steps in the Enable Omni.Replicator.Agent and Getting Started sections to generate custom synthetic data. For this quickstart guide, we recommend using the defaults below.

    ORA Defaults First Part ORA Defaults Second Part
    2. We can create a new simulation by changing the seed in the UI and generating new commands for the characters by clicking Generate Random Commands followed by Save Commands.

    3. Before starting data generation, we recommend adjusting the camera placement for better coverage. The basics of camera navigation can be found in Viewport Navigation.

    4. Data generation can take upwards of 1 hour to complete. To end data generation early, click Stop in the editor menu on the left panel of the app. After data generation completes, the Stop icon disappears and the Play button becomes visible again. For good results, we do not recommend stopping data generation prematurely.

  4. We provide a handy post-processing script that generates the video files required for RTLS app analytics from the generated data. Execute the following steps:

    1. Open a new terminal in the NoMachine UI.

    2. Run the command below to enter the Isaac Sim docker:

    docker exec -it isaacsim bash
    
    3. Next, from inside the Isaac Sim container, run the command below to generate videos from the images:

    ./python.sh /uploads/post_processing.py -bu -sd /isaac-sim/ReplicatorResult/
    

    Note

    The /isaac-sim/ReplicatorResult/ directory can easily be changed to the mounted Azure Blob Storage path when generating SDG data in Isaac Sim. To change the output path, set a custom path for the output_dir parameter in the parameters section of the Agent SDG tab in the Isaac Sim UI.

  5. While generating SDG data, make sure to point the data generation output to the Azure Blob container storage path, which is mounted in the Isaac Sim Docker container at /uploads/sdg-data/.

    1. For Video Data - /uploads/sdg-data/<folder-name-based-on-deploy-name>/<videodata-folder-name>

    2. For Calibration files - /uploads/sdg-data/<folder-name-based-on-deploy-name>/<calib-data-folder-name>
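The two destination paths above can be assembled from the folder names like this (the folder names below are purely illustrative placeholders standing in for the bracketed names; use the actual names from your deployment):

```shell
# Illustrative folder names only -- substitute your deployment's actual names.
DEPLOY_FOLDER="example-deploy"     # stands in for <folder-name-based-on-deploy-name>
VIDEO_FOLDER="example-videos"      # stands in for <videodata-folder-name>
CALIB_FOLDER="example-calib"       # stands in for <calib-data-folder-name>

echo "video data:        /uploads/sdg-data/${DEPLOY_FOLDER}/${VIDEO_FOLDER}"
echo "calibration files: /uploads/sdg-data/${DEPLOY_FOLDER}/${CALIB_FOLDER}"
```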

Note

<folder-name-based-on-deploy-name> can easily be found by exec-ing into the Isaac Sim Docker container running on the Isaac Sim VM (the ssh command is in the verify deployment section).

  6. Next, follow the Camera Calibration Guide to generate the calibration file, the top view image, and the image metadata that the RTLS application needs in order to use the generated data.

    1. In the Calibration Toolkit UI, enter the values below to generate calibration artifacts for the RTLS app.

    Camera Calibration Defaults
    2. Follow the steps in the Camera Calibration Guide to generate the calibration.json and the top view image.

    3. Next, generate the image metadata file by running the script below. This should create an imageMetadata.json file in the Azure Blob-mounted /uploads/sdg-data directory.

    ./python.sh /uploads/create_image_metadata.py -c /uploads/calibration.json -d /uploads/sdg-data/<folder-name-based-on-deploy-name>/<calib-data-folder-name>
    

    Note

    <folder-name-based-on-deploy-name> can easily be found by exec-ing into the Isaac Sim Docker container running on the Isaac Sim VM (the ssh command is in the verify deployment section).
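A quick check that the metadata file landed where the RTLS side expects it (the bracketed folder names are placeholders, exactly as in the command above):

```shell
# Confirm the generated metadata file exists; folder names are placeholders.
META="/uploads/sdg-data/<folder-name-based-on-deploy-name>/<calib-data-folder-name>/imageMetadata.json"
if [ -f "$META" ]; then
  echo "found: $META"
else
  echo "missing: $META" >&2
fi
```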

Use Generated Data in RTLS Workflow

  1. Steps to load the newly generated videos into the NVStreamer microservice of the RTLS workflow:

    1. Create a config file on the RTLS workflow VM with the config parameters required to run the load-sdg-data-to-nvstreamer-vst.sh script:

    $ cat << EOF > /mnt/sdg-data/config.json
    {
      "NGC_CLI_API_KEY": "<your-ngc-key>",
      "blob_mount_sdg_data_folder": "<videodata-folder-name>",
      "blob_mount_sdg_calibration_folder": "<calib-data-folder-name>",
      "nvstreamer_chart_url": "https://helm.ngc.nvidia.com/rxczgrvsg8nx/vst-1-0/charts/nvstreamer-0.2.32.tgz",
      "vst_chart_url": "https://helm.ngc.nvidia.com/rxczgrvsg8nx/vst-1-0/charts/vst-1.0.30.tgz",
      "ds_chart_url": "https://helm.ngc.nvidia.com/nfgnkvuikvjm/mdx-v2-0/charts/mdx-wdm-ds-app-0.0.33.tgz"
    }
    EOF
    
    2. Update the config file /mnt/sdg-data/config.json created in the step above on the RTLS workflow VM with the required values.

    $ sudo vi /mnt/sdg-data/config.json
    

    An explanation of each parameter is provided below to help you fill in the details correctly before loading the data with the script in the next step.

    • NGC_CLI_API_KEY - your NGC API key; this must be populated.

    • blob_mount_sdg_data_folder - the Azure Blob video data folder for the new SDG data (e.g., <videodata-folder-name>).

    • blob_mount_sdg_calibration_folder - the Azure Blob folder containing the calibration & image metadata generated by SDG (e.g., <calib-data-folder-name>).

    • nvstreamer_chart_url / vst_chart_url - the NVStreamer and VST chart URLs; these default to the supported versions for the current Metropolis Microservices release (Deploy NVStreamer and VST Microservices).

    • ds_chart_url - the WDM-DS chart URL; defaults to mdx-wdm-ds-app-0.0.33.tgz, the supported version for the current Metropolis Microservices release (Deploy Perception (WDM-DeepStream) Microservice).
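Before running the load script, it can be worth sanity-checking that every key above is present in the edited config. A simple grep-based sketch (key names are taken from the template above):

```shell
# Check that the edited config.json mentions every expected key.
CONFIG="/mnt/sdg-data/config.json"
for key in NGC_CLI_API_KEY blob_mount_sdg_data_folder \
           blob_mount_sdg_calibration_folder \
           nvstreamer_chart_url vst_chart_url ds_chart_url; do
  if grep -q "\"$key\"" "$CONFIG"; then
    echo "ok: $key"
  else
    echo "MISSING: $key"
  fi
done
```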

    3. Once the parameters are updated, execute the script using the command below:

    $ sudo bash /mnt/sdg-data/load-sdg-data-to-nvstreamer-vst.sh
    
  2. Steps to upload calibration data (only if you have changed camera locations or added/deleted cameras):

    • The asset upload for the RTLS App UI is done by the /mnt/sdg-data/load-sdg-data-to-nvstreamer-vst.sh script, which was executed in the previous step.

    Note

    • The upload is required only if a new batch of data was generated from a modified digital twin. If the App UI still shows no data after uploading newly generated files from SDG, troubleshoot further with the steps in Why am I not seeing data on the Kibana/reference application’s UI?

    • The default scenario comes with pre-loaded calibration.json, floor plan, and imageMetadata.json files.

Note

Once application workflow testing is complete, the infrastructure can be shut down by following the teardown steps here.

Appendix

  • The appendix can be found here.