Configuration

For more context on configuration when the microservice is used:

  • In the Multi-Camera Tracking app, please refer to its Operation Parameters section.

  • In the Occupancy Analytics app, please refer to its Operation Parameters section.

  • As a standalone microservice, refer to the README.md in its respective directory within metropolis-apps-standalone-deployment/modules/.

ConfigFile

{
 "kafka": {
   "brokers": "broker-ip:9092",
   "topics": [
     {
       "name": "rawTopic",
       "value": "mdx-raw"
     }
   ]
 },
 "sensors": [
   {
     "id": "default",
     "configs": [
       {
         "name": "tripwire_min_points",
         "value": "5"
       }
     ]
   }
 ],
 "milvus": {
   "host": "localhost",
   "port": "19530",
   "collection": "mdxBehavior",
   "indexType": "IVF_FLAT",
   "metricType": "IP",
   "vectorDim": "256",
   "sensorIdMaxLength": "100",
   "indexParam": "{\"nlist\":128}",
   "partitioningStrategy": "day",
   "clusterThreshold": "0.9",
   "triggerInterval": "60 seconds",
   "bboxConfidence": "0.5",
   "bboxSize": "200"
 },
 "triton": {
   "host": "localhost",
   "port": "8001"
 },
 "spark": [
   {
     "name": "spark.sql.shuffle.partitions",
     "value": "8"
   },
   {
     "name": "checkpoint_dir",
     "value": "checkpoint"
   },
   {
     "name": "mode",
     "value": "cluster"
   },
   {
     "name": "dataType",
     "value": "protobuf"
   },
   {
     "name": "resetMilvus",
     "value": "false"
   }
 ]
}
  • Kafka brokers - Defines the list of broker host:port entries. The sensor-processing layer sends all metadata to these brokers, and the Spark streaming pipeline consumes the metadata from them.

  • Kafka topics - The following topics are used by the Spark streaming pipeline:

    • rawTopic - Default value is mdx-raw. Comma-separated list of topics; the sensor-processing layer sends metadata to these topics. More than one topic is needed when the tracker is switched off for some sensors in the sensor-processing layer (DeepStream) and run externally; for example, an external tracker can perform object tracking based on the bounding boxes generated by the sensor-processing layer and then send the tracked messages to a different topic.

    • behaviorTopic - Default value is mdx-behavior. All behavior data is sent to this topic by the Spark streaming pipeline; see Behavior Processing.

    • behaviorPlusTopic - Default value is mdx-behavior-plus.

    • alertsTopic - Default value is mdx-alerts. All anomalies are sent to this topic by the Spark streaming pipeline.

    • tripwireTopic - Default value is mdx-tripwire.

    • framesTopic - Default value is mdx-frames. Mostly used for enriched frame metadata; the pipeline may or may not use this topic.

    • notificationTopic - Default value is mdx-notification. Used for notifications such as calibration updates and sensor additions or deletions.
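
      If the broker does not auto-create topics, they can be pre-created before starting the pipeline. A minimal sketch, assuming the kafka-python package; the partition count and replication factor are deployment-specific assumptions:

      # Illustrative only: pre-create the Kafka topics used by the pipeline.
      from kafka.admin import KafkaAdminClient, NewTopic

      topics = ["mdx-raw", "mdx-behavior", "mdx-behavior-plus", "mdx-alerts",
                "mdx-tripwire", "mdx-frames", "mdx-notification"]

      admin = KafkaAdminClient(bootstrap_servers="broker-ip:9092")
      admin.create_topics([NewTopic(name=t, num_partitions=8, replication_factor=1)
                           for t in topics])
      admin.close()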

  • Spark pipeline Config

    • spark.sql.shuffle.partitions - Optimizes executor core usage, i.e., distributes the load across all available CPU cores; default is 8. This value can be set equal to the number of Kafka partitions of the mdx-raw topic.

    • checkpoint_dir - Used for Kafka checkpointing and behavior state management.

    • dataType - Input data type; default value is “protobuf”, the other option is “JSON”.

    • resetMilvus - Default is false; if set to true, the Spark pipeline resets the Milvus collection when it starts.
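
      A minimal PySpark sketch, assuming PySpark with the Kafka connector is available, showing how spark.sql.shuffle.partitions, checkpoint_dir, and the raw topic fit together; it is not the microservice's actual streaming job:

      from pyspark.sql import SparkSession

      spark = (SparkSession.builder
               .appName("behavior-config-sketch")
               .config("spark.sql.shuffle.partitions", "8")
               .getOrCreate())

      # Read raw metadata from the mdx-raw topic.
      raw = (spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker-ip:9092")
             .option("subscribe", "mdx-raw")
             .load())

      # Checkpoint directory as configured under checkpoint_dir.
      query = (raw.writeStream
               .format("console")
               .option("checkpointLocation", "checkpoint")
               .start())
      query.awaitTermination()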

  • Milvus Config

    • host: Milvus host IP address or DNS name.

    • port: default is 19530.

    • collection: default is mdxBehavior.

    • indexType: default is IVF_FLAT; see the Milvus documentation for other index types.

    • metricType: default is IP; because the vectors are normalized, this results in cosine similarity.

    • vectorDim: default is 256.

    • sensorIdMaxLength: default is 100; increase it if sensor IDs are longer.

    • indexParam: default is {"nlist":128}.

    • partitioningStrategy: default is day; the other option is week.

    • clusterThreshold: default is 0.9.

    • triggerInterval: default is 60 seconds.

    • bboxConfidence: object detection confidence threshold; default is 0.5.

    • bboxSize: bounding box size threshold; default is 200.
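
      A minimal pymilvus sketch of creating a collection that mirrors these settings; the field names and schema are assumptions for illustration, not the collection layout the microservice actually creates:

      from pymilvus import (connections, FieldSchema, CollectionSchema,
                            DataType, Collection)

      connections.connect(host="localhost", port="19530")

      fields = [
          FieldSchema(name="pk", dtype=DataType.INT64, is_primary=True, auto_id=True),
          FieldSchema(name="sensorId", dtype=DataType.VARCHAR, max_length=100),
          FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=256),
      ]
      collection = Collection("mdxBehavior", CollectionSchema(fields))

      # IVF_FLAT index with inner-product metric, matching indexType/metricType/indexParam.
      collection.create_index(
          field_name="embedding",
          index_params={"index_type": "IVF_FLAT",
                        "metric_type": "IP",
                        "params": {"nlist": 128}},
      )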

  • Triton Config

    • host: Triton host IP address or DNS name.

    • port: default gRPC port is 8001.
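
      A quick readiness check against the configured Triton gRPC endpoint (a sketch, assuming the tritonclient[grpc] package is installed):

      import tritonclient.grpc as grpcclient

      client = grpcclient.InferenceServerClient(url="localhost:8001")
      print(client.is_server_ready())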

  • Tripwire

    • tripwire_min_points - Minimum number of points at which an object must be detected before and after crossing the tripwire to generate a tripwire event. Default is 5.

  • Social Distancing

    • proximity_detection_threshold - Used to check for a social-distancing violation between two objects; default is 1.8. Distance is measured in meters.
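
      A sketch of the distance check this threshold implies: two objects in world coordinates (meters) violate social distancing when their Euclidean distance falls below the threshold. Illustrative only; the function name is hypothetical:

      import math

      def is_proximity_violation(p1, p2, threshold_m=1.8):
          # p1, p2 are (x, y) ground-plane locations in meters
          return math.dist(p1, p2) < threshold_m

      print(is_proximity_violation((0.0, 0.0), (1.2, 0.9)))  # True: 1.5 m apart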

  • State Mgmt

    • behavior_state_timeout - If an object is not detected, or its frame messages arrive late, for a configurable period of time (default is 5 seconds), the object's state is deleted from memory and disk. If an object with the same id is sent again afterwards, a new behavior is created. The uniqueness of a behavior is based on sensor + id + start timestamp.
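
      A sketch of the timeout rule described above, assuming behavior state keyed by (sensor, id, start timestamp); illustrative only, not the pipeline's state store:

      import time

      behavior_state_timeout = 5  # seconds
      state = {}  # (sensor, obj_id, start_ts) -> last-seen epoch seconds

      def update(sensor, obj_id, start_ts):
          state[(sensor, obj_id, start_ts)] = time.time()

      def expire():
          now = time.time()
          for key in [k for k, last_seen in state.items()
                      if now - last_seen > behavior_state_timeout]:
              # A later message with the same sensor + id starts a new behavior.
              del state[key]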

  • Other Configs

    • valid_behavior_length - Minimum number of points (object detections) needed to form a behavior. Default is 10.

    • cluster_time_interval_threshold - Minimum time interval a behavior must span to be considered for cluster inference. Default is 2.

  • Overriding default config

    • All sensors use the configuration defined under the sensor entry with “id”: “default”.

      {
        "sensors": [
          {
            "id": "default",
            "configs": [
                ...
            ]
          },
          ...
        ]
      }
      
    • You can override one or more default configurations for a given sensor by adding a new entry with that sensor's id. The following changes the default value of valid_behavior_length for sensor id Warehouse_Cam1:

        {
          "sensors": [
            {
              "id": "default",
              "configs": [
                  ...
              ]
            },
            {
              "id": "Warehouse_Cam1",
              "configs": [
                  {
                      "name": "valid_behavior_length",
                      "value": "11"
                  },
                  ...
              ]
            }
          ]
        }
      
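      A sketch of how the effective configuration for a sensor could be resolved at read time: start from the "default" entry and overlay the sensor-specific entry, if present. The helper and the config path are illustrative assumptions, not the pipeline's implementation:

      import json

      def effective_config(config_path, sensor_id):
          with open(config_path) as f:
              sensors = {s["id"]: s["configs"] for s in json.load(f)["sensors"]}
          merged = {c["name"]: c["value"] for c in sensors.get("default", [])}
          merged.update({c["name"]: c["value"] for c in sensors.get(sensor_id, [])})
          return merged

      # e.g. effective_config("config.json", "Warehouse_Cam1")["valid_behavior_length"] -> "11"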

CalibrationFile

Use the Calibration tool to generate the JSON; for more details, refer to the Camera Calibration section. The calibration JSON structure is shown below:

{
"version": "1.0",
"osmURL": "",
"calibrationType": "cartesian",
"sensors": [
    {
        "type": "camera",
        "id": "Warehouse_Cam_1",
        "origin": {
            "lng": 0,
            "lat": 0
        },
        "geoLocation": {
            "lng": 0,
            "lat": 0
        },
        "coordinates": {
            "x": 0,
            "y": 0
        },
        "scaleFactor": 1,
        "attributes": [
            {
                "name": "fps",
                "value": "30"
            },
            {
                "name": "depth",
                "value": ""
            },
            {
                "name": "fieldOfView",
                "value": ""
            },
            {
                "name": "direction",
                "value": ""
            },
            {
                "name": "source",
                "value": "nvstreamer"
            },
            {
                "name": "frameWidth",
                "value": "1920"
            },
            {
                "name": "frameHeight",
                "value": "1080"
            }
        ],
        "place": [],
        "imageCoordinates": [
            {
                "x": 157,
                "y": 1013
            },
            {
                "x": 375,
                "y": 765
            },
            {
                "x": 543,
                "y": 482
            },
            {
                "x": 628,
                "y": 402
            },
            {
                "x": 694,
                "y": 345
            },
            {
                "x": 904,
                "y": 291
            },
            {
                "x": 1047,
                "y": 95
            },
            {
                "x": 1230,
                "y": 105
            },
            {
                "x": 1279,
                "y": 282
            },
            {
                "x": 1001,
                "y": 138
            },
            {
                "x": 1217,
                "y": 71
            }
        ],
        "globalCoordinates": [
            {
                "x": 302,
                "y": 535
            },
            {
                "x": 373,
                "y": 535
            },
            {
                "x": 519,
                "y": 503
            },
            {
                "x": 589,
                "y": 503
            },
            {
                "x": 661,
                "y": 503
            },
            {
                "x": 738,
                "y": 642
            },
            {
                "x": 1369,
                "y": 658
            },
            {
                "x": 1259,
                "y": 843
            },
            {
                "x": 740,
                "y": 843
            },
            {
                "x": 1131,
                "y": 640
            },
            {
                "x": 1482,
                "y": 815
            }
        ],
        "tripwires": [],
        "rois": []
    }
  ]
}

A calibration comprises an array of sensors, where each sensor record consists of multiple attributes. The ones used by the pipeline are:

  • id : Unique id of the sensor.

  • type : Type of the sensor. For example, camera.

  • origin : Comprises:

    • origin.lat : Locations often need to be in Cartesian coordinates. A small area such as a city can be treated as planar, and all locations in it can be measured using Cartesian coordinates; an arbitrary or specific location in the area is chosen as the origin. origin.lat represents the latitude of that origin.

    • origin.lng : Represents the longitude of the origin.

  • geoLocation : The geo-location of the sensor (camera), consisting of [lat, lng].

  • coordinates : The location of the sensor (camera) in Cartesian coordinates, consisting of [x, y].

  • place : Array of name-value pairs representing a place, for example city=santa-clara/building=bldg_K/room=G.

  • imageCoordinates : Object detection produces a bbox around the object; the bottom middle of the bbox, which is close to the feet of a person, is taken as the image-coordinate location. These coordinates are mapped to globalCoordinates by the calibration tool, and the mapping is used to generate the homography matrix (see the sketch after this list).

  • globalCoordinates : See imageCoordinates mentioned in the previous point.

  • rois : An array of regions of interest, used for object movement analysis, for example objects moving on a road or within a specific area of a building.

  • tripwires : An array of tripwires. A tripwire is usually drawn at a doorway entrance to count the number of people going in or out. For more details, see the Tripwire Events section.
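
Given the calibration record above, the imageCoordinates/globalCoordinates pairs can be used to estimate the image-to-ground homography and project a detection's bbox bottom-center onto the ground plane. A minimal sketch using OpenCV and NumPy; the calibration file path and the example bbox are assumptions:

import json
import cv2
import numpy as np

with open("calibration.json") as f:   # hypothetical path
    sensor = json.load(f)["sensors"][0]

img_pts = np.float32([[p["x"], p["y"]] for p in sensor["imageCoordinates"]])
gnd_pts = np.float32([[p["x"], p["y"]] for p in sensor["globalCoordinates"]])

# Homography mapping image pixels to global (ground-plane) coordinates.
H, _ = cv2.findHomography(img_pts, gnd_pts, method=cv2.RANSAC)

# Bottom-center of a detection bbox (left, top, width, height) in pixels,
# i.e., the point closest to a person's feet.
left, top, w, h = 600, 380, 80, 200
foot = np.float32([[[left + w / 2.0, top + h]]])
ground_xy = cv2.perspectiveTransform(foot, H)[0][0]
print(ground_xy)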