DriveWorks SDK Reference
3.5.78 Release
For Test and Development only

PilotNet Sample
SW Release Applicability: This sample is available in NVIDIA DRIVE Software releases.


The PilotNet sample demonstrates how to use the NVIDIA DRIVE proprietary deep neural network called PilotNet to perform path prediction. The sample streams frames either from an MP4/H.264 video file or from a live camera and runs PilotNet DNN inference on each frame to predict paths for lane-stable, lane-change, and lane-fork maneuvers in world coordinates. Based on the prediction, the sample app plots the trajectories in image space.

The sample uses a video captured by a forward-facing Sekonix camera module (AR0231 RCCB sensor) with a 60-degree field of view and a resolution of 940x604 pixels. The associated camera in the rig file is called the pilot camera.

Running the Sample

The PilotNet sample, sample_pilotnet, accepts the following optional parameters. If none are specified, the sample performs path perception on a pre-recorded video.

./sample_pilotnet --input-type=[video|camera]


    Defines whether the input is from a live camera or from a recorded video.
    Live camera is only supported on the NVIDIA DRIVE™ platform.
    Default value: video

./sample_pilotnet --camera-name=[camera]

    Name of the AR0231 RCCB camera sensor.
    Only applicable if --input-type=camera.
    Default value: SF3324

./sample_pilotnet --interface=[interface]

    Specifies the group to which the camera is connected.
    Only applicable if --input-type=camera.
    Default value: csi-a

./sample_pilotnet --link=[link]

    Indicates the camera index on the given interface.
    Only applicable if --input-type=camera.
    Default value: 0

./sample_pilotnet --video=[path/to/video]

    Path to the video file to be used for replay.
    Default value: path/to/data/samples/pilotnet/front_center_60fov.mp4

./sample_pilotnet --rig=[path/to/rigfile]

    Path to the rig file containing the calibration for the video specified by --video.
    The name of the camera in the rig file must be pilot.
    Default value: path/to/data/samples/pilotnet/rig.json

./sample_pilotnet --loop=[0|1]

    Setting the parameter to 0 exits the app once the video completes.
    Setting the parameter to 1 runs the video in an infinite loop until the user exits the app by pressing the [Esc] key.
    Default value: 1

    Directory for storing the PilotNet ROI as a PPM image for each frame.
    If not provided, no ROI image is stored.
    Default value: ""

./sample_pilotnet --printOutput=[0|1]

    Prints the network prediction for each trajectory.
    Default value: 0

Linux Examples

The sample is set up to run a default MP4 video in a loop:

./sample_pilotnet


Use the --video and --rig options to run a custom MP4/H.264 video file and rig file. Make sure that the rig file corresponds to the car from which the video was recorded.

./sample_pilotnet --video=[path/to/video] --rig=[path/to/rigfile]

Use the --loop option to enable or disable looping of the video file.

./sample_pilotnet --loop=[0|1]

Use the --printOutput option to print the world-space trajectory coordinates predicted by PilotNet.

./sample_pilotnet --printOutput=[0|1]

NVIDIA DRIVE™ Examples:

./sample_pilotnet --input-type=camera --camera-name=SF3324 --interface=csi-a --link=3

where --camera-name is a supported RCCB sensor. See Cameras Supported for the list of supported cameras for each platform.


The PilotNet sample replays a video and predicts the paths for different maneuvers, which are color-coded as follows:

  • Red: Lane Stable
  • Purple: Lane Change Left First Half
  • Blue: Lane Change Right First Half
  • Light Green: Lane Change Left Second Half
  • White: Lane Change Right Second Half
  • Yellow: Lane Split Left
  • Dark Green: Lane Split Right

Additionally, the green highlighting in the video specifies the areas of the ROI that are of interest to the network.
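The legend above can be captured as a simple lookup table. In the sketch below, the enum and function names are invented for illustration (the sample's internal types are not part of this documentation), and the exact RGB values are assumptions; only the color names come from the list above.

```cpp
#include <cstdint>

// Maneuver classes from the legend above (names are illustrative).
enum class Maneuver {
    LaneStable,
    LaneChangeLeftFirstHalf,
    LaneChangeRightFirstHalf,
    LaneChangeLeftSecondHalf,
    LaneChangeRightSecondHalf,
    LaneSplitLeft,
    LaneSplitRight,
};

struct Rgb { uint8_t r, g, b; };

// Map each predicted trajectory to its legend color. The RGB values
// are plausible stand-ins for the named colors, not the sample's
// actual rendering constants.
Rgb trajectoryColor(Maneuver m) {
    switch (m) {
        case Maneuver::LaneStable:                return {255, 0, 0};      // red
        case Maneuver::LaneChangeLeftFirstHalf:   return {128, 0, 128};    // purple
        case Maneuver::LaneChangeRightFirstHalf:  return {0, 0, 255};      // blue
        case Maneuver::LaneChangeLeftSecondHalf:  return {144, 238, 144};  // light green
        case Maneuver::LaneChangeRightSecondHalf: return {255, 255, 255};  // white
        case Maneuver::LaneSplitLeft:             return {255, 255, 0};    // yellow
        case Maneuver::LaneSplitRight:            return {0, 100, 0};      // dark green
    }
    return {0, 0, 0};
}
```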


Additional Information

For more details, see PilotNet and PilotNet Detector.