DriveWorks SDK Reference
4.0.0 Release
For Test and Development only

src/dw/sensors/sensormanager/docs/mainsection.md
# Copyright (c) 2019-2021 NVIDIA CORPORATION. All rights reserved.

@page sensormanager_mainsection Sensor Manager

## About This Module

Sensor Manager uses the underlying SAL (Sensor Abstraction Layer) APIs to provide applications with a higher-level API for accessing sensor events. It is a thread-less module whose purpose is to provide a time-sorted stream of events from an aggregate set of sensors. Key features include:

- Configuration: instantiation of sensors based on JSON configuration data.
- Sensor Event Access: a strictly time-sorted event stream from all sensors in the case of "virtual" sensors (i.e. sensor data playback from file), and a weakly-sorted event stream from all sensors when running in live mode (with real sensors).

Implementation details:

- Blocking poll operations with timeout support.
- Support for multiple event consumers.

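The time-sorting behavior can be pictured as a k-way merge over the per-sensor streams. The following is a conceptual Python sketch only (not DriveWorks API code): each sensor yields events ordered by its own timestamps, and merging them produces the single globally sorted stream that virtual (playback) mode guarantees.

```python
import heapq

# Each sensor produces events as (timestamp_us, sensor_name, payload) tuples,
# already ordered within that sensor's own stream.
camera_events = [(1000, "camera", "frame0"), (3000, "camera", "frame1")]
can_events    = [(1500, "can", "msg0"), (2500, "can", "msg1")]
gps_events    = [(2000, "gps", "fix0")]

# heapq.merge lazily interleaves the per-sensor streams into one stream
# that is globally sorted by timestamp -- the ordering a virtual
# (playback) sensor setup provides.
merged = list(heapq.merge(camera_events, can_events, gps_events))

timestamps = [ts for ts, _, _ in merged]
print(timestamps)  # [1000, 1500, 2000, 2500, 3000]
```

In live mode the merge can only be weakly sorted, because a sensor may deliver an event whose timestamp precedes one already handed out.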

## Sensor Abstraction Layer

In general, the NVIDIA<sup>&reg;</sup> DriveWorks SAL implements sensor functionality in two modular layers:

- Device type: DriveWorks provides implementations for supported sensor interfaces (e.g. Radar, Lidar, Camera, etc.).
- Device-specific decoder: handles parsing and interpretation of raw data for a given sensor model.

In addition to the device decoders already implemented within the DriveWorks library, customers may implement additional device decoders using plug-in interfaces. The list of interfaces to be implemented is defined in an interface header file that is specific to the sensor type. As part of application deployment, this pluggable interface is compiled as a shared object and loaded by DriveWorks at runtime.

The current API definition for the SAL can be found in the DriveWorks SDK documentation.
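
The plug-in mechanism can be pictured as a registry of decoders keyed by protocol string, where loading a shared object amounts to registering one more entry. The sketch below is a hypothetical Python illustration of that idea only (the names and toy decoders are invented; the real plugin ABI is C and is defined in the sensor-type-specific interface headers):

```python
# Conceptual sketch (invented names, not the DriveWorks plugin ABI):
# decoders are looked up per protocol string, and loading a plugin
# conceptually adds another entry to the registry.
def decode_can_frame(raw: bytes) -> dict:
    # Toy decoder: first byte is an ID, the rest is the payload.
    return {"id": raw[0], "payload": raw[1:]}

DECODERS = {"can.socket": decode_can_frame}

def register_plugin(protocol: str, decode_fn) -> None:
    """What loading a decoder shared object conceptually does."""
    DECODERS[protocol] = decode_fn

# A customer-supplied plugin for a custom lidar protocol:
register_plugin("lidar.custom", lambda raw: {"points": len(raw) // 16})

print(DECODERS["can.socket"](b"\x07abc"))
print(DECODERS["lidar.custom"](bytes(32)))
```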

## Sensor Manager

Sensor Manager is agnostic to the specific sensor type; instead, it provides unified access to all types of sensors supported by SAL. Sensor instantiation is handled through the `::dwRigHandle_t` (see @ref rig_mainsection). An example configuration file is shown below. This configuration defines a one-camera setup (an ar0231 RCCB camera), a CAN bus, and a GPS sensor.

```
{
    "rig": {
        "sensors": [
            {
                "name": "camera:front:center:60fov",
                "parameter": "camera-type=ar0231-rccb,camera-group=a,camera-count=1,camera-mask=0001,siblingIndex=0",
                "properties": {
                    "Model": "ftheta",
                    "width": "1920",
                    "height": "1208",
                    "cx": "960",
                    "cy": "604",
                    "bw-poly": "0.0 0.000545421498827636 -1.6216719633103e-10 -4.64720492990289e-12 2.85224527762934e-16"
                },
                "protocol": "camera.gmsl",
                "nominalSensor2Rig": {
                    "quaternion": [-0.502444, 0.507493, -0.497444, 0.492494],
                    "t": [1.749, -0.1, 1.47]
                },
                "sensor2Rig": {
                    "quaternion": [-0.502444, 0.507493, -0.497444, 0.492494],
                    "t": [1.749, -0.1, 1.47]
                }
            },
            {
                "name": "can:vehicle",
                "nominalSensor2Rig": {
                    "quaternion": [0.0, 0.0, 0.0, 1.0],
                    "t": [0.0, 0.0, 0.0]
                },
                "parameter": "device=can0",
                "properties": null,
                "protocol": "can.socket",
                "sensor2Rig": {
                    "quaternion": [0.0, 0.0, 0.0, 1.0],
                    "t": [0.0, 0.0, 0.0]
                }
            },
            {
                "name": "gps:dataspeed",
                "nominalSensor2Rig": {
                    "quaternion": [0.0, 0.0, 0.0, 1.0],
                    "t": [-0.2, -0.1, 0.4]
                },
                "parameter": "can-proto=can.socket,can-params=device=can0",
                "properties": null,
                "protocol": "gps.dataspeed",
                "sensor2Rig": {
                    "quaternion": [0.0, 0.0, 0.0, 1.0],
                    "t": [-0.2, -0.1, 0.4]
                }
            }
        ]
    },
    "version": 2
}
```
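
The `parameter` fields above are comma-separated `key=value` lists. A minimal Python sketch of how such a string can be split (illustrative only; `parse_sensor_parameter` is an invented helper, not a DriveWorks function):

```python
def parse_sensor_parameter(parameter: str) -> dict:
    """Split a 'key=value,key=value' parameter string into a dict.

    Each item is split on its first '=' only, so a value may itself
    contain '=' (e.g. can-params=device=can0). This simple split assumes
    values contain no commas.
    """
    result = {}
    for item in parameter.split(","):
        key, _, value = item.partition("=")
        result[key] = value
    return result

params = parse_sensor_parameter(
    "camera-type=ar0231-rccb,camera-group=a,camera-count=1,camera-mask=0001,siblingIndex=0")
print(params["camera-group"])  # a
```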

## Grouping Camera Sensors

One of the features of the module is the grouping of events from synchronized cameras, with the ability to request them together in a single acquisition call. To combine multiple cameras into a group, the user reflects this in the configuration file: the `camera-group` sensor parameter indicates the named group to which a camera is assigned. In the following example both cameras are assigned to the same group; therefore, each acquired event will contain two frames, one for each camera.

```
{
    "rig": {
        "sensors": [
            {
                "name": "camera:front:center:60fov",
                "parameter": "file=front_camera.mp4,camera-group=main_group",
                "protocol": "camera.virtual"
            },
            {
                "name": "camera:rear:center:60fov",
                "parameter": "file=rear_camera.mp4,camera-group=main_group",
                "protocol": "camera.virtual"
            }
        ]
    },
    "version": 2
}
```

If the user wants to split the events, it is enough to specify different names in the `camera-group` parameter:

```
{
    "rig": {
        "sensors": [
            {
                "name": "camera:front:center:60fov",
                "parameter": "file=front_camera.mp4,camera-group=separate-group-1",
                "protocol": "camera.virtual"
            },
            {
                "name": "camera:rear:center:60fov",
                "parameter": "file=rear_camera.mp4,camera-group=separate-group-2",
                "protocol": "camera.virtual"
            }
        ]
    },
    "version": 2
}
```

In this case, each acquired event will contain a single frame from a single camera, selected by minimum timestamp.
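
The grouping behavior described above can be sketched as follows (conceptual Python with invented names, not DriveWorks code): pending frames are bucketed by group, and one acquired event delivers the group whose earliest frame has the minimum timestamp.

```python
from collections import defaultdict

# One pending frame per camera: (timestamp_us, camera_name, group)
frames = [
    (1000, "camera:front:center:60fov", "main_group"),
    (1001, "camera:rear:center:60fov", "main_group"),
]

def next_camera_event(frames):
    """Bucket pending frames by camera-group, then return the group whose
    earliest frame has the minimum timestamp -- one acquired event."""
    groups = defaultdict(list)
    for ts, name, group in frames:
        groups[group].append((ts, name))
    # Pick the group containing the overall earliest frame.
    best = min(groups.values(), key=lambda g: min(ts for ts, _ in g))
    return sorted(best)

# Same group: one event carries both frames.
print(len(next_camera_event(frames)))  # 2

# Different groups: the event carries only the earliest camera's frame.
split = [(1000, "camera:front:center:60fov", "g1"),
         (1001, "camera:rear:center:60fov", "g2")]
print(len(next_camera_event(split)))  # 1
```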

It is assumed that all virtual cameras processed by the module are synchronized. Therefore, by default they are all combined into a single group. That is, the first example is equivalent to the following:

```
{
    "rig": {
        "sensors": [
            {
                "name": "camera:front:center:60fov",
                "parameter": "file=front_camera.mp4",
                "protocol": "camera.virtual"
            },
            {
                "name": "camera:rear:center:60fov",
                "parameter": "file=rear_camera.mp4",
                "protocol": "camera.virtual"
            }
        ]
    },
    "version": 2
}
```

#### Input

The following are the input requirements for the Sensor Manager:

- All sensors used by the Sensor Manager (defined in a JSON configuration file) must be supported by the Sensor Abstraction Layer (SAL).
- All sensor data must be timestamped (as provided by SAL).

#### Output

The only output requirement for the Sensor Manager is to provide the respective sensor data for the given event when the function to acquire the next event is called.
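
This acquire-next-event contract can be mimicked with a toy model. The sketch below is a hypothetical Python stand-in (the actual entry points are C functions documented in the SDK reference, and the invented rule that exactly one event may be outstanding is an assumption of this sketch): each call hands out the next pending event, and the consumer releases it before acquiring the next.

```python
from collections import deque

class EventSketch:
    """Toy model of an acquire/release event contract (invented stand-in,
    not the DriveWorks C API): one event may be outstanding at a time."""
    def __init__(self, events):
        self._pending = deque(events)
        self._acquired = None

    def acquire_next_event(self):
        if self._acquired is not None:
            raise RuntimeError("previous event not released")
        if not self._pending:
            return None  # end of stream; a real system would block or time out
        self._acquired = self._pending.popleft()
        return self._acquired

    def release_acquired_event(self):
        self._acquired = None

m = EventSketch([(1000, "gps"), (1500, "can")])
seen = []
while (ev := m.acquire_next_event()) is not None:
    seen.append(ev)            # consume the sensor data for this event
    m.release_acquired_event() # hand the event back before the next acquire
print(seen)  # [(1000, 'gps'), (1500, 'can')]
```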

## Relevant Tutorials

- @ref sensormanager_usecase1