Traffic Lane Zone

The traffic lane zone is available on the Qb2 product.

It cannot be used in combination with security or volume zones.

The traffic lane zone tracks vehicles driving through the zone on a single lane.

Application

Velocity Estimation

Using multiple detections of a vehicle in the zone, its velocity can be calculated.
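
One straightforward way to obtain a velocity from multiple detections is a least-squares fit of the tracked positions over time; the slope of that fit is the velocity vector. The sketch below is an illustration of this idea, not the product's actual estimator:

```python
import numpy as np

def estimate_velocity(positions: np.ndarray, timestamps: np.ndarray) -> np.ndarray:
    """Estimate a constant velocity vector from multiple detections.

    positions:  (N, 3) array of vehicle centroids, one per frame
    timestamps: (N,) array of frame times in seconds
    """
    # Least-squares fit of position over time; the slope is the velocity.
    t = timestamps - timestamps[0]
    A = np.stack([t, np.ones_like(t)], axis=1)            # (N, 2) design matrix
    slope, _intercept = np.linalg.lstsq(A, positions, rcond=None)[0]
    return slope                                          # (3,) velocity in m/s
```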

Size Estimation

While the vehicle passes through the scene, its point cloud is aggregated frame by frame to create a dense final result, from which the object dimensions are measured.

Vehicle Classification

Using a support vector machine, an 8+1 classification of vehicle types is possible; for more details, see the pages on classification and object classes.
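
As an illustration of this approach, the sketch below trains a scikit-learn support vector machine on per-vehicle feature vectors. The features and placeholder data are assumptions; the actual model, features, and training data are not published:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical feature vector per tracked vehicle, e.g. bounding-box
# length/width/height plus simple point-cloud statistics.
X_train = np.random.rand(200, 5)           # placeholder training features
y_train = np.random.randint(0, 9, 200)     # 8 vehicle classes + 1 "other"

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

predicted_class = clf.predict(np.random.rand(1, 5))
```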

Placement

The traffic lane zone should be placed in the scene so that it extends along the y-axis, and rotated so that the zone label faces the origin. The vehicles should move in the negative y-direction, towards the Qb2. Additionally, the front of the zone should end where the field of view of the sensor ends, so that events are triggered when vehicles leave the scene. For a single sensor, it is beneficial to place the zone so that the center line of the sensor's field of view runs along the middle of the lane.
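
As a sketch of this orientation rule, the following helper computes a zone yaw so that the zone's local y-axis points away from a sensor at the origin; vehicles approaching the sensor then move in the negative y-direction of the zone frame. This is pure geometry for illustration; the actual configuration fields are defined in the configuration API:

```python
import math

def zone_yaw_towards_origin(zone_x: float, zone_y: float) -> float:
    """Yaw (rad) so that the zone's local +y axis points away from the
    sensor at the origin."""
    # Direction from the sensor to the zone center.
    away = math.atan2(zone_y, zone_x)
    # The local +y axis of a frame with yaw theta points along
    # (-sin(theta), cos(theta)); solving for theta gives away - pi/2.
    return away - math.pi / 2.0
```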

Figure 1. Placement of Multiple Zones
Figure 2. Placement of a Single Zone

Parameters

See configuration API definition.

Expected Velocity (unit: \(\frac{m}{s}\))

Velocity that the vehicles are expected to have inside the lane zone. This value is used as the initial velocity during object tracking.
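
Since the parameter is given in \(\frac{m}{s}\), a posted speed limit in km/h has to be converted; a trivial (assumed) helper:

```python
def kmh_to_ms(speed_kmh: float) -> float:
    # 1 km/h = 1000 m / 3600 s
    return speed_kmh / 3.6

expected_velocity = kmh_to_ms(50.0)  # 50 km/h lane -> ~13.9 m/s
```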

Algorithm

The traffic lane algorithm takes the portion of the foreground point cloud enclosed by the zone as input. Object detection is performed on these input points; the detected objects are then tracked and assigned a class.
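
The concrete detection method is not specified; as one plausible sketch, the foreground points could be grouped into object candidates with a density-based clustering such as DBSCAN (parameters here are illustrative):

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_objects(foreground: np.ndarray) -> list[np.ndarray]:
    """Group foreground points into object candidates.

    foreground: (N, 3) points inside the traffic lane zone.
    Returns one point array per detected cluster.
    """
    labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(foreground)
    return [foreground[labels == k] for k in set(labels) if k != -1]
```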

Tracking

For each frame, the following tracking steps are performed (a simplified sketch of the loop follows the list):

  1. Predict the next position of already tracked vehicles.

  2. Combine detected clusters that belong to a single vehicle.

  3. Assign existing tracks to detected vehicles.

  4. Update each vehicle's position and velocity, then motion-correct and aggregate its point cloud.

  5. Create new tracks for vehicles that were detected for the first time.

  6. Delete tracks of vehicles that left the zone.

  7. Send out the current state of the traffic lane and events for each vehicle that left the zone.
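
A minimal per-frame sketch of this loop, assuming a greedy nearest-neighbour association, detections given as dictionaries with `centroid` and `points` entries, and a zone front at y = 0 in the zone frame (all assumptions; step 2, cluster merging, is omitted for brevity). The actual implementation is not published; this only illustrates the flow of the steps:

```python
import numpy as np

class Track:
    """Simplified vehicle track with a constant-velocity motion model."""
    def __init__(self, track_id: int, position: np.ndarray, velocity: np.ndarray):
        self.id = track_id
        self.position = position
        self.velocity = velocity
        self.clouds: list[np.ndarray] = []  # per-frame point clouds for aggregation

    def predict(self, dt: float) -> np.ndarray:
        # Step 1: predict the next position from the current velocity.
        return self.position + self.velocity * dt

def track_frame(tracks, detections, dt, expected_velocity, next_id):
    """One frame of the tracking loop (steps 1 and 3-7; step 2 omitted)."""
    unmatched = list(detections)
    for tr in tracks:
        pred = tr.predict(dt)
        if unmatched:
            # Step 3: greedy nearest-neighbour assignment to the prediction.
            dists = [np.linalg.norm(d["centroid"] - pred) for d in unmatched]
            det = unmatched.pop(int(np.argmin(dists)))
            # Step 4: update position/velocity and collect the point cloud.
            tr.velocity = (det["centroid"] - tr.position) / dt
            tr.position = det["centroid"]
            tr.clouds.append(det["points"])
    # Step 5: new tracks are seeded with the expected lane velocity
    # (vehicles move in the negative y-direction of the zone frame).
    for det in unmatched:
        tracks.append(Track(next_id, det["centroid"],
                            np.array([0.0, -expected_velocity, 0.0])))
        next_id += 1
    # Steps 6/7: emit an event for, and drop, every track that passed
    # the zone front (assumed here to lie at y = 0).
    events = [tr for tr in tracks if tr.position[1] < 0.0]
    tracks = [tr for tr in tracks if tr.position[1] >= 0.0]
    return tracks, events, next_id
```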

Motion Correction

Since not all points of a point cloud are captured at the same time, objects moving at high speeds will be subject to motion distortion. By using the estimated velocity of the vehicle, this effect can be corrected.
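
A minimal sketch of such a correction, assuming each point carries a capture timestamp: every point is shifted back along the vehicle's estimated motion between its capture time and a common reference time:

```python
import numpy as np

def motion_correct(points: np.ndarray, timestamps: np.ndarray,
                   velocity: np.ndarray, t_ref: float) -> np.ndarray:
    """Undo motion distortion for points captured at different times.

    points:     (N, 3) raw points of one vehicle
    timestamps: (N,) capture time of each point in seconds
    velocity:   (3,) estimated vehicle velocity in m/s
    t_ref:      reference time the cloud is shifted to
    """
    # A point captured at time t sits velocity * (t - t_ref) further
    # along the motion than at t_ref, so it is shifted back by that amount.
    return points - np.outer(timestamps - t_ref, velocity)
```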

Figure 3. Motion Distorted Vehicle
Figure 4. Motion Corrected Vehicle

Aggregation

As the vehicle moves through the traffic lane and closer to the sensor, more and more parts of it become visible. For very long vehicles like trucks or buses with trailers, not all parts of the vehicle might be visible at the same time. Using knowledge about the position of the object, multiple detections across frames can be combined into a single, more dense point cloud.
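
A minimal sketch of this aggregation, assuming motion-corrected clouds and a tracked vehicle position per frame: each frame's points are shifted into the vehicle's own frame and the frames are stacked:

```python
import numpy as np

def aggregate(clouds: list[np.ndarray], positions: list[np.ndarray]) -> np.ndarray:
    """Merge per-frame detections of one vehicle into a single dense cloud.

    clouds:    motion-corrected point clouds, one per frame
    positions: tracked vehicle position for each frame
    """
    # Expressing every frame relative to the tracked position aligns the
    # partial views before stacking them into one dense point cloud.
    return np.vstack([pts - pos for pts, pos in zip(clouds, positions)])
```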

Classification

If a classification model is used, the tracked vehicles are classified accordingly. If not, the dimensions of the object bounding boxes are used to assign the classes car, van, or truck from the 8+1 vehicle classification.
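
A sketch of such a dimension-based fallback; the thresholds are illustrative assumptions, not the product's actual values:

```python
def classify_by_dimensions(length: float, width: float, height: float) -> str:
    """Fallback classification from bounding-box size (illustrative thresholds)."""
    if length > 7.0 or height > 3.0:
        return "truck"
    if height > 2.0 or length > 5.5:
        return "van"
    return "car"
```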

Output Data

The vehicles in the state and event streams are always given in the local coordinate system of the corresponding traffic lane zone, not in the global coordinate system of the sensor.
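
If global coordinates are needed, the output has to be transformed with the zone's pose. A sketch for a zone described by an origin and a yaw angle (the actual pose representation in the API may differ):

```python
import math
import numpy as np

def zone_to_sensor(p_local: np.ndarray, zone_origin: np.ndarray,
                   zone_yaw: float) -> np.ndarray:
    """Transform a vehicle position from zone-local coordinates into the
    sensor's coordinate system (pose fields are assumptions)."""
    c, s = math.cos(zone_yaw), math.sin(zone_yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])  # rotation about the z-axis
    return R @ p_local + zone_origin
```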

State

See full API definition.

vehicles

List of vehicles currently inside the zone. Each entry contains the position, dimensions, velocity, point cloud and class of the vehicle.

Event

See full API definition.

An event is created every time a vehicle leaves a traffic lane zone.

zone_uuid

The UUID of the traffic lane zone that emitted the event.

vehicle

The vehicle leaving the zone, including its position, dimensions, velocity, aggregated point cloud, and vehicle class.

id

The unique id of the vehicle.
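
A sketch of consuming these events, with hypothetical dataclasses mirroring the fields described above (the real types are defined in the API definition linked above):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Vehicle:
    """Hypothetical mirror of the event payload described above."""
    id: int
    position: np.ndarray
    dimensions: np.ndarray   # length, width, height
    velocity: np.ndarray
    point_cloud: np.ndarray
    vehicle_class: str

@dataclass
class TrafficLaneEvent:
    zone_uuid: str
    vehicle: Vehicle

def handle_event(event: TrafficLaneEvent, counts: dict[str, int]) -> None:
    # Count vehicles per class as they leave the lane zone.
    cls = event.vehicle.vehicle_class
    counts[cls] = counts.get(cls, 0) + 1
```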