Getting Set Up

This tutorial will get you set up to run Isaac apps on a Carter robot and an x86 machine.

  1. Set up an account at https://catalog.ngc.nvidia.com/

  2. Contact your NVIDIA admin to be added to the EA org for Isaac 2.0 (ea-isaac).

  3. After you are granted access, you will receive an email with an invite link to confirm your membership in the EA org.

  4. Obtain an API key, which is necessary to download the Docker containers from NGC (a login example follows this list).
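
The API key doubles as the password for the NGC container registry (nvcr.io); this is standard NGC usage rather than anything Isaac-specific. As a quick sanity check that the key works, you can log in to the registry on the machine where you will pull the containers:

docker login nvcr.io

When prompted, enter the literal string $oauthtoken as the username and paste your API key as the password.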

NOVA Init is a Debian package that sets up the sensors and the Jetson device. To check whether it is installed, run the following:

dpkg -l nova-init
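
If the package is present, dpkg lists it with status ii (installed); the output looks roughly like the sketch below, where the version and description columns are illustrative:

||/ Name       Version    Architecture Description
+++-==========-==========-============-============================
ii  nova-init  <version>  arm64        NOVA sensor and device setup

If it is missing, dpkg instead reports that no packages were found matching nova-init.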

If it’s not installed, it can be downloaded and installed with the NGC client:

  1. Check if the NGC client is installed.

    which ngc
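
    If the client is installed, the command prints its path and exits with status 0; no output means the client is not on your PATH. The same check in scripted form (a sketch, not required for the tutorial):

    which ngc > /dev/null && echo "NGC CLI found" || echo "NGC CLI missing"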

  2. If the NGC client is not installed, download and install it using the following commands:

    wget --content-disposition https://ngc.nvidia.com/downloads/ngccli_arm64.zip && \
        unzip ngccli_arm64.zip && \
        chmod u+x ngc-cli/ngc
    s="export PATH=\"\$PATH:$(pwd)/ngc-cli\""; f="$HOME/.bashrc"; grep -qxF "$s" $f || echo "$s" | tee -a $f && source $f

    Note

    Official documentation can be found at https://ngc.nvidia.com/setup/installers/cli under “ARM64 Linux”; however, the commands listed there should not be used in this case.

  3. Configure the NGC client to give access to Isaac resources. Use the API key that was generated in the Setting up an NGC Account and Getting the API Key section to run the following:

    ngc config set
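
    This starts a short interactive prompt. The exact prompts vary between NGC CLI versions, but a session looks roughly like the following sketch (the org value is an assumption based on the EA org named above; use the values shown for your account on ngc.nvidia.com):

    Enter API key [no-apikey]: <paste your API key>
    Enter CLI output format type [ascii]: ascii
    Enter org [no-org]: ea-isaac
    Enter team [no-team]: no-team
    Enter ace [no-ace]: no-ace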

  4. Download NOVA Init from NGC using the following command:

    ngc registry resource download-version "mfql6xnjuziw/nova_init"
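
    The resource lands in a versioned folder in the current working directory (the nova_init_v* pattern used in the next step). You can confirm where it was placed with:

    ls -d nova_init_v*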

  5. Once the nova-init Debian package is downloaded, change to the download directory and install it:

    cd $(ls -td nova_init_v* | head -1)
    sudo apt install ./nova-init_*_arm64.deb
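
After the installation finishes, you can repeat the check from the beginning of this section; an ii status prefix in the dpkg output confirms that nova-init is installed and configured:

dpkg -l nova-init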

Connecting the Robot to Your WiFi Network

Follow these steps to connect the Carter robot to the local WiFi network:

  1. Connect the robot to a display using the HDMI port on the rear interface panel (refer to the Robot Components section for the HDMI port location).

  2. Connect the robot to the wired Ethernet port (RJ45 connector) on the rear interface panel.

  3. Connect the keyboard with touchpad to the rear interface panel.

  4. Log in to the robot (contact NVIDIA for the robot password).

  5. Follow the WiFi instructions for Ubuntu.

  6. Once WiFi is configured, disconnect the HDMI Monitor and Keyboard.

Connecting the PC to the Robot

To control, deploy, launch, and debug applications, you have to connect to the robot.

First, you must get the IP address of the robot. Assuming you have the robot connected to a display and a keyboard with touchpad from the previous section, follow these instructions:

  1. Use CTRL+ALT+t to open a terminal.

  2. Find the IP address of the WiFi interface by running ifconfig wlan0. You should see output similar to the screenshot below; the IP address is highlighted with a red rectangle.

connect_robot_terminal.png
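
For reference, the relevant part of the ifconfig output looks roughly like this sketch (the MAC address and netmask are illustrative; the inet field holds the address you need):

wlan0: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 10.110.66.127  netmask 255.255.255.0  broadcast 10.110.66.255
        ether 48:b0:2d:xx:xx:xx  txqueuelen 1000  (Ethernet)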

  3. Record the IP address; in this case, it is 10.110.66.127. This IP address will be used for connecting in all further tutorials and may need to be retrieved again if it changes.

  4. SSH into the robot:

    ssh <USER>@<ROBOT_IP>

    1. Use a computer that is connected to the same WiFi network as your robot.

    2. Assuming you are on a Linux computer, open a terminal with `CTRL+ALT+t` and run `ssh <USER>@<ROBOT_IP>`. In this case, it would be `ssh nvidia@10.110.66.127`.

    3. Enter the password to log in to the robot (contact NVIDIA for the robot password). The tutorials/tools in the following sections will be run from this SSH connection.

    Note

    The login prompt will time out for security, and the HDMI output will turn off after a while due to inactivity, so retry if the screen goes black.

Connecting the Joystick

A joystick controller is shipped paired with the robot. To connect the paired joystick to Carter, press the PS button on the PS5 controller shown below.

connect_joystick.png

Note

Once the controller is connected, the LEDs on the controller should stay solid blue; flashing blue LEDs indicate the controller is trying to connect.

Note

If the controller LEDs continue flashing, the controller is attempting to connect to a paired device. If the controller is not paired with the robot, the blue LEDs blink for about a minute and then stop. The easiest way to pair the controller again is to connect it to the robot with a cable; use the details in this section to pair the controller.


Manual Extrinsic Calibration

The goal of the Manual Extrinsic Calibration process is to provide the position and orientation of a set of sensors in the robot frame. This process includes the calibration of one sensor (e.g., LIDAR) with respect to the robot frame and multiple calibrations of sets of two or more sensors (e.g., camera-LIDAR or camera-camera) with overlapping fields of view.

The Isaac 2.0 manual calibration supports the following calibration options:

  • 3D LIDAR - stereo camera

  • 3D LIDAR - depth camera

  • 3D LIDAR - ground

Manual Extrinsic Calibration requires static scenes, which you will collect with the Record and Upload tool.

Prerequisites

Before using Manual Extrinsic Calibration, you need to have the Record and Upload tool running on your robot.

Recording Data

  1. After launching the Record and Upload tool, fill in the following fields:

    • Title: manual_calibration_<platform-id> (e.g. manual_calibration_carter-v23-11)

    • Tags: manual_calibration

    data-recorder-frontend-manual-calibration.png


  2. Drive the robot to six different scenes, as described in the Static Scene Guidelines section below. At each scene, take a 1-second recording by pressing Start and Stop in the recording frontend. The outcome of this step is six 1-second recordings as .pod files.

  3. If you are calibrating the front and back cameras on the robot, repeat Step 2 while pointing the cameras on the back of the robot at similar scenes. The outcome of this step is six additional 1-second recordings as .pod files.

  4. Upload the recorded .pod files using the Upload tab of the Record and Upload tool. We recommend connecting the robot to a wired network via Ethernet cable before uploading the data.

Static Scene Guidelines

The Manual Extrinsic Calibration does not require calibration boards. Instead, it uses recordings of static, structured scenes on flat ground, similar to the example scene below:

example_scene_1.jpg

The requirements for each recording include the following:

  • A static and short (1-second) scene without any movement of the robot or objects around it

  • Flat ground

  • Empty space in a circle of 2.5 m (about 8 feet) radius around the robot.

  • Multiple structured objects with sharp edges at different distances and locations in the image, in the areas that have LIDAR coverage (refer to the Good Examples below). There should be large distance variations between the foreground objects and their background.

  • Avoid cluttered scenes and objects with round edges (refer to the Bad Examples below). Minimize the number of direct light sources and avoid black foreground objects in the scene (dark objects are acceptable).

Good Examples

Examples of good geometries for the static scenes include handrails, thin poles, flat planes like doors or cubicle walls, columns, bridges, and low-hanging ceilings or tabletops whose edges are clearly visible against the background:

good_geometries_1.png

good_geometries_2.png

good_geometries_6.png

good_geometries_3.png

good_geometries_4.png

good_geometries_5.png


Bad Examples

Examples of bad geometries, which you should avoid in the static-scene recordings, include objects with round edges, like chairs and round poles, and cluttered scenes where it is difficult to find the edges of the foreground objects with different sensor modalities:

bad_geometries_1.png

bad_geometries_2.png

bad_geometries_3.png

Planned Improvements

The Isaac 2.5 release is planned to support automated extrinsic calibration for 3D LIDAR and cameras on the Carter robot. Isaac 2.5 will include a live data collection tool for calibration: an operator will be required to position a calibration board at certain locations in front of the robot sensors. The recording that includes the desired board positions will be used to generate a calibration file for the set of sensors.

Running the Joystick Application

The Isaac joystick application allows you to test out the robot's movement with remote controls. It is provided as a Docker container.

Run the following command on the robot to pull the Docker container and try the joystick application:

docker run -it --gpus all --rm --network=host --privileged \
    -v /dev:/dev \
    -v /sys:/sys \
    nvcr.io/<your_staging_area>/robot_remote_control_segway:isaac_2.0

Note

Depending on your installation of Docker, you may have to use sudo with the command above.

Note

Replace <your_staging_area> with your assigned NGC Registry staging area.

You should hear the Carter beep, indicating that you can now move it around. To move the robot, press and hold the L1 button and use the thumbsticks to navigate manually. The robot is configured to move at a maximum speed of 1.1 m/s to ensure safe operation.

joystick_app.png

To stop the app, press Ctrl+C on the PC.

Note

Exercise caution when using the controller. Test it first in an enclosed space before taking the robot into an open space.

After testing the Joystick application, you are ready to try out various use cases with this robot.

Isaac 2.0 features multiple important use cases:

  1. Map Creation

  2. Autonomous Navigation

The tutorial sections will walk you through these two workflows.
