Sensor Calibration

The goal of the Manual Extrinsic Calibration process is to provide the position and orientation of a set of sensors in the robot frame. This process includes the calibration of one sensor (e.g. LIDAR) with respect to the robot frame and multiple calibrations of sets of two or more sensors (e.g. camera-LIDAR or camera-camera) with overlapping fields of view.

The Isaac 2.0 manual calibration supports the following calibration options:

  • 3D LIDAR - stereo camera

  • 3D LIDAR - depth camera

  • 3D LIDAR - ground

Manual Extrinsic Calibration requires static scenes, which you can collect with the Record and Upload tool.

The manual extrinsic calibration process involves the following steps (outlined in the image below):

  1. Record static scenes as detailed in the sections below.

  2. Contact NVIDIA after uploading the data recordings and request the isaac_calibration.json file.

  3. After receiving the isaac_calibration.json file, deploy it to the corresponding robot in the following location: /etc/nova/calibration/isaac_calibration.json.

    user_journey_calibration_v2.png
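The deployment step above can be sketched as a small shell function. This is a hedged illustration, not an official NVIDIA utility: `install_calibration` is a hypothetical helper name, and the destination defaults to the documented path but can be overridden (e.g. when testing on a non-robot machine).

```shell
# Hypothetical helper (not an NVIDIA tool): place a received
# isaac_calibration.json at the documented location on the robot.
# Usage: install_calibration <file> [dest-dir]
install_calibration() {
  src=${1:?usage: install_calibration <file> [dest-dir]}
  dest=${2:-/etc/nova/calibration}   # path from the docs
  mkdir -p "$dest" &&
  cp "$src" "$dest/isaac_calibration.json" &&
  echo "Installed $src at $dest/isaac_calibration.json"
}
```

On the robot itself, writing to /etc/nova/calibration typically requires root privileges, so the function would be run with sudo or from a root shell.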


Note

Calibration files are validated by NVIDIA; there is no need to perform additional validation of the calibration files apart from deploying them.


Prerequisites

Before using Manual Extrinsic Calibration, you need to launch the Recorder app on your robot. When launching the Recorder app, modify the --param argument as follows:


--param=uploader.s3_uploader/s3_uploader/bucket=<YOUR_S3_BUCKET>

Note

Replace <YOUR_S3_BUCKET> with your AWS S3 bucket.

Note

Calibration recordings must be uploaded to the root of <YOUR_S3_BUCKET>, without any additional subfolders.
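The bucket-root requirement can be expressed as a small check. The following shell helper is purely illustrative (`is_root_key` is not part of any NVIDIA tooling): it rejects S3 object keys that contain a subfolder.

```shell
# Illustrative helper (not an NVIDIA tool): calibration recordings must sit
# at the bucket root, so a valid object key contains no '/' separators.
is_root_key() {
  case "$1" in
    */*) return 1 ;;  # key contains a subfolder: not at the bucket root
    *)   return 0 ;;  # plain file name at the bucket root
  esac
}
```

For example, `is_root_key recording.pod` succeeds, while `is_root_key scenes/recording.pod` fails.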


Recording Data

  1. After launching the Record and Upload tool, fill in the following fields:

    • Title: manual_calibration_<platform-id> (e.g. manual_calibration_carter-v23-11)

    • Tags: manual_calibration

    data-recorder-frontend-manual-calibration.png
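The title convention can be sketched as a tiny shell helper (`make_title` is a hypothetical name, not part of the Recorder frontend):

```shell
# Hypothetical helper: build the recording title from a platform ID,
# following the manual_calibration_<platform-id> convention.
make_title() {
  printf 'manual_calibration_%s\n' "$1"
}
```

For example, `make_title carter-v23-11` prints `manual_calibration_carter-v23-11`.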


  2. Drive the robot to six different scenes, as described in the Static Scenes section below. At each of the scenes, take a 1-second recording by pressing Start and Stop in the recording frontend. The outcome of this step is six 1-second recordings saved as .pod files.

  3. Upload the recorded .pod files using the Upload tab of the Record and Upload Tool. We recommend connecting the robot to a wired network via Ethernet cable before uploading the data.

Note

If the calibration file is not present in /etc/nova/calibration/isaac_calibration.json, an ERROR message will be printed in the console.
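A pre-flight check that mirrors this error condition might look like the following shell sketch (`check_calibration` is a hypothetical helper, not part of Isaac; it only tests for the file's presence, not its validity):

```shell
# Hypothetical pre-flight check: report whether the calibration file is
# present at the documented path (or at an explicitly given path).
check_calibration() {
  calib=${1:-/etc/nova/calibration/isaac_calibration.json}
  if [ -f "$calib" ]; then
    echo "OK: found $calib"
  else
    echo "ERROR: calibration file not found at $calib" >&2
    return 1
  fi
}
```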


Static Scene Guidelines

The Manual Extrinsic Calibration does not require calibration boards. Instead, it uses recordings of static, structured scenes on flat ground, similar to the example scene below:

example_scene_1.jpg

The requirements for each recording include the following:

  • A static and short (1-second) scene without any movement of the robot or objects around it

  • Flat ground

  • Empty space in a circle of 2.5 m (about 8 feet) radius around the robot.

  • Multiple structured objects with sharp edges at different distances and locations in the image, in the areas that have LIDAR coverage (refer to the Good Examples below). There should be large distance variations between the foreground objects and their background.

  • Avoid cluttered scenes and objects with round edges (refer to the Bad Examples below), minimize sources of direct light, and avoid black foreground objects in the scene (dark objects are acceptable).

Examples of good geometries for the static scenes include handrails, thin poles, flat planes like doors or cubicle walls, columns, bridges, and low-hanging ceilings or table tops with edges that stand out clearly against the background:

good_geometries_1.png

good_geometries_2.png

good_geometries_6.png

good_geometries_3.png

good_geometries_4.png

good_geometries_5.png

Examples of bad geometries, which you should avoid in the static-scene recordings, include objects with round edges, like chairs and round poles, and cluttered scenes where it is difficult to find the edges of the foreground objects with different sensor modalities:

bad_geometries_1.png

bad_geometries_2.png

bad_geometries_3.png

Planned Improvements

The Isaac 2.5 release is planned to support automated extrinsic calibration for 3D LIDAR and cameras on the Carter robot. Isaac 2.5 will include a live data collection tool for calibration: an operator positions a calibration board at specified locations in front of the robot sensors, and the recording that captures the desired board positions is then used to generate a calibration file for that set of sensors.

© Copyright 2018-2023, NVIDIA Corporation. Last updated on Dec 8, 2023.