# Copyright (c) 2019-2020 NVIDIA CORPORATION. All rights reserved.

@page egomotion_mainsection Egomotion

The Egomotion module tracks and predicts a vehicle's pose, on the basis of a motion model,
given measurements from multiple sensors. The motion model is selected at initialization time.
During run-time, the module takes measurements as input and internally updates the current
estimate of the vehicle pose. The module can be queried for vehicle motion between any two
timestamps.

The odometry-only motion model, selectable by setting `dwEgomotionParameters.motionModel` to
`::DW_EGOMOTION_ODOMETRY`, estimates the vehicle motion based on velocity [m/s] and
steering angle [rad] measurements on the road. In addition, the model expects various vehicle
parameters to be passed during the initialization phase, such as `mass`, `wheelBase`, `inertia`,
etc. Vehicle parameters must be passed through the `dwEgomotionParameters.vehicle` parameter.
The model assumes a driven rear axle and steerable front wheels, driving on a 2D plane. The
model supports 3 degrees of freedom: 2D position and yaw angle for orientation. This means any
request for relative motion estimation between two timestamps is determined only up to these 3
degrees of freedom. In addition, the model supports estimation of yaw rate as well as lateral
acceleration. Predictions are made assuming constant steering angle and velocity over the
prediction interval.

To select IMU-based motion estimation, specify `::DW_EGOMOTION_IMU_ODOMETRY` as the type of the
motion model. In this mode, the module estimates the vehicle's motion and orientation based on
velocity, steering angle, and IMU measurements, i.e. gyroscope and linear accelerometer readings.
The change in position is estimated with the same Ackermann principle as in the odometry-only
motion model, with the difference that the vehicle moves along the current estimate of the body
plane. The orientation, however, is estimated using a filter that fuses gyroscope and linear
accelerometer measurements. For this to function properly, an Inertial Measurement Unit (IMU) is
required that returns gyroscope as well as linear accelerometer measurements on all 3 axes. In
addition, the calibration mapping the IMU to the vehicle coordinate system must be known, see
`dwEgomotionParameters.imu2rig`. This motion model supports full 6-DoF estimation, i.e. position
as well as orientation.

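The exact filter the module uses for orientation is not specified here. Purely as an illustration
of the fusion principle, the following sketches a basic single-axis complementary filter, where the
gyroscope contributes a smooth but drifting rate and the accelerometer contributes a noisy but
drift-free absolute angle derived from gravity. `fuseOrientation` and the gain `alpha` are
hypothetical and not part of the DriveWorks API.

```c
#include <math.h>

/* Illustrative complementary filter for a single axis (e.g. pitch).
   The integrated gyroscope rate tracks fast motion; the accelerometer
   angle slowly pulls the estimate back to the gravity reference,
   bounding drift. The module's internal filter is more elaborate;
   this only sketches the fusion idea. */
static double fuseOrientation(double previousAngle, double gyroRate,
                              double accelAngle, double dt)
{
    const double alpha = 0.98; /* weight on the integrated gyro path */
    return alpha * (previousAngle + gyroRate * dt)
         + (1.0 - alpha) * accelAngle;
}
```

With a stationary vehicle (zero gyro rate), the estimate converges to the accelerometer-derived
angle; during fast maneuvers, the gyroscope term dominates.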
The Egomotion module internally maintains a history of poses to allow querying relative poses
between any two timestamps. The returned relative pose represents the relative motion of the
vehicle, i.e. the change of orientation and translation that the vehicle performed from
timestamp A to timestamp B, in the local flat Euclidean space. If a timestamp lies in the future,
a prediction is performed; the module assumes how the vehicle will move based on the last known
state of the sensors.

The Egomotion module also provides a filter that fuses relative motion estimates with a GNSS
sensor. This fusion is handled by the Global Egomotion variant, which can take any relative
motion estimates as input. Refer to the documentation provided by @ref global_egomotion_group.

- @ref egomotion_usecase1
- @ref egomotion_usecase2

- @ref egomotion_group
- @ref global_egomotion_group
<!-- - @ref egomotion_state_group - HIDDEN API -->