The Occupancy Grid Mapping sample showcases a Bayesian occupancy grid. The sample allocates a grid and lets the application insert point clouds or object lists (polygons stored as lists of points) captured from a stationary or moving vehicle. You can observe the point clouds or objects accumulating in the grid and, as the vehicle moves, the grid is updated according to the probabilities specified at initialization.
Each layer of the occupancy grid is initialized with a set of probabilities. For point clouds, these are the probability that a grid cell is free at the point itself, the probability that the cell at the sensor origin is free space, and the probability of free space beyond the point. Point clouds are inserted by casting a ray from the sensor origin to each point. For object lists, only the probability that a cell is free at the object is specified. Each layer also specifies the sensor-to-rig transformation so that inserted points are oriented correctly in the grid.
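The per-layer update can be pictured with the minimal sketch below, assuming a single 2D layer maintained as log-odds with a Bayesian update along each ray. The names (LayerParams, GridLayer, insertPoint) and the choice to apply the origin probability along the entire ray up to the point are assumptions of this sketch; they are not the DriveWorks occupancy grid API.

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

struct LayerParams
{
    float probFreeAtPoint;  // probability that the cell containing the lidar return is free (low)
    float probFreeAtOrigin; // probability that cells from the sensor origin up to the point are free (high)
    float probFreeBeyond;   // probability of free space beyond the point (typically ~0.5, i.e. unknown)
};

class GridLayer
{
public:
    GridLayer(uint32_t side, float cellsPerMeter, LayerParams params)
        : m_side(side), m_cellsPerMeter(cellsPerMeter), m_params(params),
          m_logOdds(side * side, 0.0f) {}

    // Insert one point given in rig coordinates, i.e. after applying the
    // layer's sensor-to-rig transform; (ox, oy) is the sensor origin in rig
    // coordinates. The ray from the origin to the point is walked cell by cell.
    void insertPoint(float px, float py, float ox, float oy)
    {
        const float dx = px - ox, dy = py - oy;
        const float dist = std::sqrt(dx * dx + dy * dy);
        if (dist < 1e-3f)
            return;
        const uint32_t steps = std::max<uint32_t>(1u, static_cast<uint32_t>(dist * m_cellsPerMeter));

        for (uint32_t i = 0; i < steps; ++i) // origin up to (but not including) the point
        {
            const float t = static_cast<float>(i) / static_cast<float>(steps);
            update(ox + t * dx, oy + t * dy, m_params.probFreeAtOrigin);
        }
        update(px, py, m_params.probFreeAtPoint);  // the cell containing the point
        update(px + dx / dist / m_cellsPerMeter,   // one cell beyond the point
               py + dy / dist / m_cellsPerMeter, m_params.probFreeBeyond);
    }

    // Posterior probability that a cell is free, recovered from its log-odds.
    float probabilityFree(uint32_t ix, uint32_t iy) const
    {
        return 1.0f / (1.0f + std::exp(-m_logOdds[iy * m_side + ix]));
    }

private:
    void update(float x, float y, float pFree)
    {
        // Map rig coordinates (grid centered on the rig) to a cell index.
        const int32_t ix = static_cast<int32_t>(std::floor(x * m_cellsPerMeter)) + static_cast<int32_t>(m_side) / 2;
        const int32_t iy = static_cast<int32_t>(std::floor(y * m_cellsPerMeter)) + static_cast<int32_t>(m_side) / 2;
        if (ix < 0 || iy < 0 || ix >= static_cast<int32_t>(m_side) || iy >= static_cast<int32_t>(m_side))
            return;
        // Bayesian log-odds update, clamped so a cell never saturates completely.
        const float p = std::min(std::max(pFree, 0.01f), 0.99f);
        float& l     = m_logOdds[iy * m_side + ix];
        l = std::min(std::max(l + std::log(p / (1.0f - p)), -10.0f), 10.0f);
    }

    uint32_t m_side;
    float m_cellsPerMeter;
    LayerParams m_params;
    std::vector<float> m_logOdds;
};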
The sample application uses three sensors in combination: CAN data for the car's position and orientation, a lidar for point cloud data, and a camera for visualization.
The occupancy grid sample, sample_occupancy_grid, accepts the following optional parameters. If none are specified, it will process pre-recorded data.
./sample_occupancy_grid --canFile=[path/to/can/file]
--dbcFile=[path/to/dbc/file]
--lidarFile=[path/to/lidar/file]
--videoFile=[path/to/video/file]
--videoTimestampFile=[path/to/video/timestamp/file]
--fps=[integer]
where
--canFile=[path/to/can/file]
Is the recorded CAN data.
Default value: path/to/data/samples/occupancy_grid/can_vehicle.bin
--dbcFile=[path/to/dbc/file]
Is the DBC file for interpreting the CAN data.
Default value: path/to/data/samples/occupancy_grid/DataspeedByWire.dbc
--lidarFile=[path/to/lidar/file]
Is the recorded lidar data.
Default value: path/to/data/samples/occupancy_grid/lidar_0_front-center.bin
--videoFile=[path/to/video/file]
Is the recorded h264 video.
Default value: path/to/data/samples/occupancy_grid/video_front_center.h264
--videoTimestampFile=[path/to/video/timestamp/file]
Is the timestamp file associated with the videoFile.
Default value: path/to/data/samples/occupancy_grid/video_time_0.txt
--fps=[integer]
Is the frame rate, in frames per second, at which the sample plays back.
Default value: 30
All inputs are assumed to be in the same format that the NVIDIA® DriveWorks recording tool produces. The videoFile must be h264 and must have an associated videoTimestampFile so that it can be synchronized properly with the other sensors. Additionally, all sensors must be recorded at the same time so that they can be synchronized and played back together.
./sample_occupancy_grid --canFile=can.bin --dbcFile=can.dbc --lidarFile=lidar.bin --videoFile=video.h264 --videoTimestampFile=video_time.txt --fps=30
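As a rough illustration of the synchronization described above, the sketch below replays timestamped events from several recorded streams in capture order. The types and stream names are hypothetical and are not the DriveWorks sensor API.

#include <cstdint>
#include <queue>
#include <string>
#include <vector>

// One recorded reading from any sensor; the shared microsecond clock is what
// makes cross-sensor playback possible.
struct SensorEvent
{
    int64_t timestampUs;
    std::string sensor; // "can", "lidar", or "video"
};

// Order events so that the earliest timestamp is handled first.
struct LaterFirst
{
    bool operator()(const SensorEvent& a, const SensorEvent& b) const
    {
        return a.timestampUs > b.timestampUs;
    }
};

int main()
{
    // In the sample these events come from the recorded files (CAN, lidar,
    // and video plus its timestamp file); here a few are hard-coded.
    std::priority_queue<SensorEvent, std::vector<SensorEvent>, LaterFirst> events;
    events.push({1100, "lidar"});
    events.push({1000, "can"});
    events.push({1200, "video"});

    // Popping by timestamp interleaves vehicle poses, lidar returns, and video
    // frames in capture order, which is why all streams must share one clock.
    while (!events.empty())
    {
        SensorEvent e = events.top();
        events.pop();
        // dispatch e to the ego-motion, grid update, or rendering stage here
    }
    return 0;
}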
The expected output is a Bayesian occupancy grid in which free areas are colored white, occupied areas are colored black, and unknown areas are gray. The shade of each cell represents the probability of that cell being free.
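For reference, the color convention above can be expressed as a small sketch, assuming the grid exposes a per-cell probability of being free in [0, 1]; the helper name is illustrative only.

#include <cstdint>

// 1.0 (free) maps to white, 0.0 (occupied) to black, 0.5 (unknown) to mid gray.
uint8_t freeProbabilityToGray(float probFree)
{
    if (probFree < 0.0f) probFree = 0.0f;
    if (probFree > 1.0f) probFree = 1.0f;
    return static_cast<uint8_t>(probFree * 255.0f + 0.5f);
}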
For more details, see freespace_occupancygrid_mainsection.