DriveWorks SDK Reference 3.5.78 Release For Test and Development only
Egomotion
Note
SW Release Applicability: This module is available in both NVIDIA DriveWorks and NVIDIA DRIVE Software releases.

The Egomotion module tracks and predicts a vehicle's pose based on a motion model and measurements from multiple sensors. The motion model is selected at initialization time. At run-time, the module consumes measurements as input and internally updates its current estimate of the vehicle pose. The module can then be queried for the vehicle motion between any two points in time.

The odometry-only model, selected by setting dwEgomotionParameters.motionModel to DW_EGOMOTION_ODOMETRY, estimates vehicle motion from velocity [m/s] and steering angle [rad] measurements. In addition, the model expects various vehicle parameters, such as mass, wheelBase, and inertia, to be passed during the initialization phase through dwEgomotionParameters.vehicle. The model assumes a driven rear axle and steerable front wheels, traveling on a 2D plane. It supports 3 degrees of freedom: 2D position and yaw angle for orientation. Any relative motion estimate between two timestamps is therefore determined only up to these 3 degrees of freedom. The model also estimates yaw rate and lateral acceleration. Predictions assume constant steering angle and velocity over the prediction interval.

To select IMU-based motion estimation, specify DW_EGOMOTION_IMU_ODOMETRY as the type of the motion model. In this mode, the module estimates the vehicle's motion and orientation from velocity, steering angle, and IMU measurements (gyroscope and linear accelerometer). The change in position is estimated with the same Ackermann principle as in the odometry-only motion model, with the difference that the vehicle moves along the current estimate of the body plane. The orientation, however, is estimated by a filter that fuses gyroscope and linear-accelerometer measurements. This mode requires an Inertial Measurement Unit (IMU) that provides gyroscope as well as linear-accelerometer measurements on all three axes. In addition, the calibration mapping the IMU to the vehicle coordinate system must be known; see dwEgomotionParameters.imu2rig. This motion model supports full 6-DoF estimation, i.e. position as well as orientation.

The Egomotion module internally maintains a history of poses, allowing queries for the relative pose between any two timestamps. The returned relative pose represents the relative motion of the vehicle, i.e. the change in orientation and translation that the vehicle underwent from timestamp A to timestamp B, in a local flat Euclidean space. If a queried timestamp lies in the future, the module predicts the motion, assuming the vehicle continues to move according to the last known state of the sensors.

The Egomotion module's relative motion estimates can also be fused with a GNSS sensor to obtain a global position estimate. This fusion is handled by the Global Egomotion variant, which accepts any relative motion estimates as input. Refer to the documentation of the Global Egomotion Interface.