# Copyright (c) 2019-2020 NVIDIA CORPORATION. All rights reserved.

@page dwx_freespace_detection_sample Freespace Detection Sample (OpenRoadNet)

@note SW Release Applicability: This sample is available in **NVIDIA DRIVE Software** releases.

@section dwx_freespace_detection_description Description
Drivable collision-free space, i.e., the space that can be immediately reached without
collision, provides critical information for navigation in autonomous driving.
This free-space sample demonstrates the NVIDIA end-to-end technique for detecting
collision-free space in a road scenario. The problem is modeled with a deep
neural network (OpenRoadNet), whose input is a three-channel RCB image and
whose output is a boundary across the image from left to right. The boundary
separates obstacles from the open road space. In addition, each pixel on the
boundary is associated with one of four semantic labels:
This @ref freespace_mainsection sample has been trained using RCB images
with moderate augmentation.

This sample consumes an H.264 or RAW video and computes the free-space
boundary on each frame. The sample can also consume video from live cameras.
The image datasets used to train OpenRoadNet have been captured using a View Sekonix
Camera Module (SS3324, SS3325) with an AR0231 RCCB sensor. The camera is mounted high up at the
rear-view mirror position. Demo videos are captured at 2.3 MP.

To achieve the best free-space detection performance, NVIDIA recommends adopting
a similar camera setup and aligning the video center vertically with the horizon.

@section dwx_freespace_detection_running Running the Sample
The freespace detection sample, `sample_freespace_detection`, accepts the following optional parameters. If none are specified, the sample performs detection on a supplied pre-recorded video.
    ./sample_freespace_detection --input-type=[video|camera]
                                 --rig=[path/to/rig/file]
                                 --video=[path/to/video]
                                 --camera-type=[camera]
                                 --camera-group=[a|b|c|d]
                                 --camera-index=[0|1|2|3]
                                 --maxDistance=[fp_number]
    --input-type=[video|camera]
            Defines whether the input is from a live camera or from a recorded video.
            Live camera is only supported on the NVIDIA DRIVE platform.

    --rig=[path/to/rig/file]
            Points to the rig file containing the camera properties.
            Default value: path/to/data/samples/freespace/rig.json

    --video=[path/to/video]
            Specifies the absolute or relative path of a RAW or H.264 recording.
            Only applicable if --input-type=video.
            Default value: path/to/data/samples/freespace/video_freespace.h264

    --camera-type=[camera]
            Specifies the camera type.
            Only applicable if --input-type=camera.
            Default value: ar0231-rccb-bae-sf3324

    --camera-group=[a|b|c|d]
            Specifies the group to which the camera is connected.
            Only applicable if --input-type=camera.

    --camera-index=[0|1|2|3]
            Indicates the camera index on the given port.

            Setting this parameter to 1 when running the sample on Xavier B allows it to
            access a camera that is being used on Xavier A. Only applicable if --input-type=camera.

    --maxDistance=[fp_number]
            Defines the maximum distance, in meters, at which the free-space boundary can be distinguished.
@subsection dwx_freespace_det_examples Examples

#### To run the sample on the Linux host (x86)

    ./sample_freespace_detection --video=<video file.h264> --rig=<calibration file.json>

    ./sample_freespace_detection --video=<video file.raw> --rig=<calibration file.json>

#### To run the sample on NVIDIA DRIVE platforms with cameras

    ./sample_freespace_detection --input-type=camera --camera-type=<camera_type> --camera-group=<camera_group> --rig=<calibration file.json>

where `<camera_type>` is a supported `RCCB` sensor.
See @ref supported_sensors for the list of supported cameras for each platform.
@note The free-space detection sample directly resizes video frames to the network input
resolution. Therefore, for the best performance, use videos with an aspect ratio similar
to that of the demo video, or set a Region of Interest (ROI) to perform inference on a
sub-window of the full frame.
@section dwx_freespace_det_output Output

The free-space detection sample:

- Overlays polylines for the detected free-space boundary points.
- Computes boundary points in the car coordinate system, if a valid camera calibration file is provided.
The colors of the polylines represent the types of obstacle that the boundary interfaces with:
126 
@section dwx_freespace_det_more Additional Information

For more information, see @ref freespace_mainsection.