CoarseToFineStereoDepth

Isaac provides CoarseToFineStereoDepth, a depth estimation algorithm that uses GPU acceleration to determine stereo disparity and convert it to stereo depth in real-time.

Stereo disparity refers to the difference in coordinates of similar features within two stereo images, resulting from the horizontal distance between two cameras (parallax). By comparing these two stereo images, the relative depth information can be obtained in the form of a disparity map, which encodes the difference in horizontal coordinates of corresponding image points. The values in this disparity map are inversely proportional to the scene depth at the corresponding pixel location.
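The conversion from disparity to depth follows the standard pinhole stereo relation: depth equals the focal length (in pixels) times the camera baseline (in meters) divided by the disparity (in pixels). The sketch below is illustrative only; the function name and handling of invalid disparities are assumptions, not part of the CoarseToFineStereoDepth API.

    #include <limits>

    // Illustrative only: converts a single disparity value (in pixels) to depth
    // (in meters) using the pinhole stereo relation
    //   depth = focal_length_px * baseline_m / disparity_px
    // Larger disparities therefore map to smaller depths (inverse proportionality).
    // Example: a 0.1 m baseline, 400 px focal length, and 10 px disparity give
    // 400 * 0.1 / 10 = 4 m.
    float DisparityToDepth(float disparity_px, float focal_length_px, float baseline_m) {
      if (disparity_px <= 0.0f) {
        // Zero or negative disparity carries no depth information.
        return std::numeric_limits<float>::quiet_NaN();
      }
      return focal_length_px * baseline_m / disparity_px;
    }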

CoarseToFineStereoDepth can accurately infer the depth of 90% of the pixels up to 5 m with an overall D1 score of 8.3%, tested on a dataset of 80 synthetic stereo images with 512 x 256 resolution per eye. On Jetson Xavier, it can process stereo images at VGA resolution (672 x 376 each) from the Zed Camera at 27 frames per second (FPS).
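The D1 score is commonly defined following the KITTI stereo convention: the fraction of valid pixels whose estimated disparity differs from the ground truth by more than 3 pixels and by more than 5% of the ground-truth value. Assuming that convention, a minimal sketch of the computation is shown below; the function name and flat array layout are illustrative assumptions.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Illustrative only: computes the D1 outlier rate of an estimated disparity
    // map against ground truth. A pixel counts as an outlier when its disparity
    // error exceeds both 3 px and 5% of the ground-truth disparity (KITTI
    // convention). Pixels without valid ground truth are skipped.
    float ComputeD1(const std::vector<float>& estimated,
                    const std::vector<float>& ground_truth) {
      std::size_t valid = 0;
      std::size_t outliers = 0;
      for (std::size_t i = 0; i < estimated.size() && i < ground_truth.size(); ++i) {
        const float gt = ground_truth[i];
        if (gt <= 0.0f) continue;  // no ground truth at this pixel
        ++valid;
        const float error = std::fabs(estimated[i] - gt);
        if (error > 3.0f && error > 0.05f * gt) ++outliers;
      }
      return valid > 0 ? static_cast<float>(outliers) / static_cast<float>(valid) : 0.0f;
    }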

The images below show depth/disparity estimates produced by CoarseToFineStereoDepth from footage captured with a ZED stereo camera.

../../../_images/c2f1.jpg ../../../_images/c2f2.jpg ../../../_images/c2f3.jpg

Source Code

Isaac uses the stereo depth estimation code in the form of a static library.

CoarseToFineStereoDepth is wrapped as an Isaac codelet, and is available in the Isaac repository.

Isaac Codelet

The Isaac codelet wrapping CoarseToFineStereoDepth takes a pair of rectified stereo images and publishes a dense depth map (in meters), where each pixel in the map corresponds to the depth value of that pixel in the input image from the left camera. The codelet uses the camera intrinsics and extrinsics to compute the stereo depth. The minimum and maximum values for the output depth can be provided as input parameters via a configuration file.
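As a rough sketch of how a configured depth range might be applied, the snippet below invalidates computed depth values that fall outside the configured minimum and maximum. The function and parameter names (min_depth, max_depth) are assumptions for illustration, not necessarily the codelet's actual configuration keys.

    #include <vector>

    // Illustrative only: applies a configured valid depth range to a dense depth
    // map, marking out-of-range pixels as invalid (0). The min_depth / max_depth
    // parameters mirror the kind of values a configuration file would supply.
    void ApplyDepthRange(std::vector<float>& depth_map, float min_depth, float max_depth) {
      for (float& depth : depth_map) {
        if (depth < min_depth || depth > max_depth) {
          depth = 0.0f;  // treat out-of-range estimates as invalid
        }
      }
    }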

Running the Sample Application

The CoarseToFineStereoDepth sample application uses a ZED stereo camera. First connect the ZED camera to the host system or the Jetson platform you are using. Then use one of the following procedures to run the included sample application.

To Run the Sample Application on the Host System

  1. Build the sample application with the following command:

    bob@desktop:~/isaac$ bazel build //apps/samples/stereo_matching_depth
    
  2. Run the sample application with the following command:

    bob@desktop:~/isaac$ bazel run //apps/samples/stereo_matching_depth
    

To Run the Application on Jetson

  1. Build a package on the host and then deploy it to the Jetson system.

  2. Run the following command on the host computer, where <JETSON_IP> is replaced by the IP address of your Jetson system.

    bob@desktop:~/isaac$ ./engine/build/deploy.sh -p //apps/samples/stereo_matching_depth:stereo_matching_depth-pkg -d jetpack42 -h <JETSON_IP>
    
  3. Log on to the Jetson system and run the application with the following commands:

    bob@jetson:~/$ cd deploy/bob/stereo_matching_depth-pkg
    bob@jetson:~/deploy/bob/stereo_matching_depth-pkg$ ./apps/samples/stereo_matching_depth/stereo_matching_depth
    

    Where “bob” is your user name on the host system.

To View Output from the Application in Websight

While the application is running, open Isaac Sight in a browser by navigating to http://localhost:3000. If running the application on a Jetson platform, make sure to use the IP address of the Jetson system instead of localhost.

In Websight, a window called LeftCamera shows the left input image and a window called Depth shows the depth estimated by the algorithm. The depth values are normalized between the min and max depth values provided to the Depth Visualizer codelet. Each pixel in the depth map corresponds to the depth, in meters, of that pixel in the left input image.
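The normalization used for display can be pictured as a linear mapping of each depth value into [0, 1] between the configured minimum and maximum depth. The sketch below shows that mapping; it is illustrative rather than the visualizer's actual implementation.

    #include <algorithm>

    // Illustrative only: linearly maps a depth value (meters) into [0, 1] for
    // display, given the min/max depth configured for the visualizer. Values
    // outside the range are clamped to the nearest end.
    float NormalizeDepthForDisplay(float depth_m, float min_depth_m, float max_depth_m) {
      const float range = max_depth_m - min_depth_m;
      if (range <= 0.0f) return 0.0f;  // degenerate configuration
      const float normalized = (depth_m - min_depth_m) / range;
      return std::min(1.0f, std::max(0.0f, normalized));
    }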

../../../_images/c2f4.png