Evidence Grid Map DNN

An Evidence Grid Map (EGM) is an intermediate grid-based representation that fuses data from different sensors and generates a unified map. Dempster-Shafer theory is used to construct an occupancy grid around the robot, where each cell contains three floats: the belief mass that the cell is free, the belief mass that the cell is occupied, and the leftover mass, which is treated as uncertain. Each sensor message is parsed into an evidence grid map, and the individual grid maps are combined to construct a unified Evidence Grid Map of the environment.
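As a sketch of how Dempster-Shafer fusion works for a single cell, the following Python snippet combines two (free, occupied, uncertain) mass triples using Dempster's rule of combination. This illustrates the underlying theory only; it is not the extension's actual implementation:

```python
def combine(m1, m2):
    """Dempster's rule of combination for one grid cell.

    Each argument is a (free, occupied, uncertain) tuple of belief
    masses summing to 1. Conflicting evidence (one source says free,
    the other says occupied) is discarded and the rest renormalized.
    """
    f1, o1, u1 = m1
    f2, o2, u2 = m2
    conflict = f1 * o2 + o1 * f2      # mass on contradictory hypotheses
    norm = 1.0 - conflict             # renormalization factor
    free = (f1 * f2 + f1 * u2 + u1 * f2) / norm
    occupied = (o1 * o2 + o1 * u2 + u1 * o2) / norm
    uncertain = (u1 * u2) / norm
    return (free, occupied, uncertain)
```

Note how two sources that weakly agree a cell is free produce a combined mass that is more confident than either input, while the uncertain mass shrinks.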

The core of the EGM DNN is a robust, accurate, real-time, vision-based DNN that creates a dense, top-down representation of obstacles around a robot. The module contains the Shako DNN architecture, which is inspired by previous USS-based EGM DNN networks designed for vision-based navigation tasks.

Follow the steps in the Mapping Tutorial to use the Data Recorder to generate a POD recording from the robot. Use the following command to run data extraction from a POD recording:


cd isaac/sdk/ && bash extensions/egm_fusion/apps/dataset_from_pod_extractor.sh $POD_FILE $OUTPUT_PATH

  • $POD_FILE is the absolute path to the POD recording

  • $OUTPUT_PATH is the absolute path to the extracted data directory
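If you have several recordings, the extraction command above can be scripted. The helper below is a hypothetical sketch (the directory layout is an assumption) that builds the same argv and runs it once per .pod file:

```python
import subprocess
from pathlib import Path

# Extraction script from the command above, relative to isaac/sdk
EXTRACTOR = "extensions/egm_fusion/apps/dataset_from_pod_extractor.sh"

def extraction_command(pod_file, output_path):
    """Build the argv for the extraction script (run from isaac/sdk)."""
    return ["bash", EXTRACTOR, str(pod_file), str(output_path)]

def extract_all(recording_dir, output_root):
    """Extract every .pod recording under recording_dir (assumed layout)."""
    for pod in sorted(Path(recording_dir).glob("*.pod")):
        out = Path(output_root) / pod.stem
        subprocess.run(extraction_command(pod, out), cwd="isaac/sdk", check=True)
```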

Based on the images extracted above, use the following command to run EGM Shako DNN inference on an x86 machine with a dGPU or on an Orin with an iGPU:


cd isaac/sdk/ && dazel run //extensions/egm_fusion/apps/shako_inference -- --param=image_left_loader.image_reader/reader/image_path=$LEFT_IMAGE_PATH --param=image_right_loader.image_reader/reader/image_path=$RIGHT_IMAGE_PATH --param=egm_post_processor.egm_writer/writer/output_file_base_path=$EGM_DIR

  • $LEFT_IMAGE_PATH is the absolute path to the extracted left image

  • $RIGHT_IMAGE_PATH is the absolute path to the extracted right image

  • $EGM_DIR is the absolute path to the directory where the Shako DNN writes its EGM inference results

Note

When running the above applications for the first time, TRT plan generation will take a long time (up to a minute). Subsequent runs use the cached TRT plan file and should be faster.

This app provides an interface on the Carter2.3 robot to run the inference pipeline open loop while controlling the robot with a joystick. The EGM inference results are then visualized in Sight. To deploy the app on the Carter2.3:


cd isaac/sdk/ && ./../engine/engine/build/deploy.py -t //extensions/egm_fusion/apps/shako_carter_open_loop_inference:shako_carter_open_loop_inference -r carter-v23-9.dyn.nvidia.com -c jetpack51

To run the app on the Carter2.3, ssh into it and, once the app is successfully deployed, execute the provided shell script, for example:


/home/nvidia/deploy/<USERNAME>/shako_carter_open_loop_inference/run_shako_carter_open_loop_inference.sh

In your browser, open <hostname_of_carter>:3000 to visualize the input Hawk images and the output EGM.

Note

When running the above applications for the first time, TRT plan generation will take a long time (up to a minute). Subsequent runs use the cached TRT plan file and should be faster.

This app provides live inference of the Shako model architecture from replayed POD files.


dazel run //extensions/egm_fusion/apps/shako_pod_replay_inference:shako_pod_replay_inference -- -p <POD_PATH> -m <MODEL_PATH>

  • <POD_PATH>: Absolute path to the POD file to be replayed

  • <MODEL_PATH>: Absolute path to the folder holding all required model files (the ONNX files bucket_encoder.onnx, camera_encoder.onnx, camera_unprojection.onnx, egm_decoder.onnx, and fusion.onnx, and the precomputation tables lookup.dat, row_embedding.dat, and radial_embedding.dat)
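Before launching, it can help to verify that the model folder actually contains every required file, since a missing ONNX file or precomputation table will only surface at runtime. A minimal sketch (the helper function is ours; the file names come from the list above):

```python
from pathlib import Path

# Required model files as listed in the documentation above
REQUIRED_MODEL_FILES = [
    "bucket_encoder.onnx", "camera_encoder.onnx", "camera_unprojection.onnx",
    "egm_decoder.onnx", "fusion.onnx",
    "lookup.dat", "row_embedding.dat", "radial_embedding.dat",
]

def missing_model_files(model_path):
    """Return the required files that are absent from model_path."""
    root = Path(model_path)
    return [name for name in REQUIRED_MODEL_FILES if not (root / name).is_file()]
```

Run this against the folder you plan to pass as <MODEL_PATH> and only start the replay app when the returned list is empty.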

Note

When running the above applications for the first time, TRT plan generation will take a long time (up to a minute). Subsequent runs use the cached TRT plan file and should be faster.

To visualize the inference, open localhost:3000 in your browser. You should see the input Hawk images in greyscale and the resulting EGM inference.

© Copyright 2018-2023, NVIDIA Corporation. Last updated on Oct 30, 2023.