Authors: Julie Levy, B.Eng., and Paul St-Aubin, P.Eng., Ph.D.

Permanent PTZ (pan-tilt-zoom) traffic cameras (i.e., controllable cameras whose feeds stream back to a traffic control center) are often panned, tilted, and zoomed throughout the day as operators respond to congestion, incidents, or collisions. Even when operators do not move the camera, it can still move if the support structure it is affixed to moves, typically from sway in the wind or from vibrations caused by heavy vehicles and transmitted through the ground and the support structure. Previously, this was a challenge for a product such as TrafxSAFE Connect, which monitors road users continuously and requires recalibration any time the camera moves: safety event data, speed information, and even movement counts are calculated from the in-camera positions of road users, so any camera movement invalidates the calibration and leads to inaccurate road user counts and safety results.

To address this problem, Transoft Solutions has invested in technology which performs automatic camera movement detection and correction.

Here’s a bit about how the algorithm works. First, prominent features of the image are identified and used to “fingerprint” the background based on the colors and contrast between pixels in the image. The algorithm automatically ignores features it believes belong to the foreground (moving road users). The collective panning, tilting, and zooming of these background features is then tracked over successive images to determine whether the camera has moved. Sufficiently small movements can be corrected automatically because the continuous movement is known and tracked; correction is simply a matter of applying the transformation in reverse to the objects, as in the sketch below. However, this automatic correction only readjusts the positions of objects in the camera’s current field of view. If the camera moves so much that it no longer sees a significant portion of the original area of interest being monitored, then no amount of movement compensation can restore the missing portion of the road users’ trajectories.
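
The actual implementation is not published here, but the general technique can be sketched with standard computer vision tools. The example below is a minimal illustration using OpenCV; the feature detector, matching thresholds, and function names are our own assumptions, not Transoft’s code. It estimates the camera motion between a calibrated reference frame and the current frame from matched background features, then applies the inverse transformation to road user positions.

```python
# Minimal sketch of feature-based camera-motion estimation and correction.
# Assumptions: OpenCV (cv2) and NumPy are available; frames are grayscale
# images of the same scene; road user positions are pixel coordinates.
import cv2
import numpy as np

def estimate_camera_motion(reference_frame, current_frame):
    """Estimate a homography mapping the calibrated reference view to the current view."""
    orb = cv2.ORB_create(nfeatures=1000)           # prominent corner-like features
    kp_ref, des_ref = orb.detectAndCompute(reference_frame, None)
    kp_cur, des_cur = orb.detectAndCompute(current_frame, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_cur)

    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_cur[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects inconsistent matches, e.g. features on moving road users,
    # so the estimate is dominated by the static background "fingerprint".
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def correct_positions(positions, H):
    """Map road user positions from the moved view back into the calibrated view."""
    pts = np.float32(positions).reshape(-1, 1, 2)
    corrected = cv2.perspectiveTransform(pts, np.linalg.inv(H))
    return corrected.reshape(-1, 2)
```

In practice the motion estimate would be updated continuously and sanity-checked before being trusted; if too few background features survive the matching step, the movement is treated as too large to correct.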

If the camera movement is too great, the system will try to match the video recordings with another pre-calibrated field of view, and data collection for that field of view will take place instead. The matching criterion requires that the recording deviate by no more than 20% in panning, tilting, or zooming from a calibrated field of view. If no pre-calibrated view meets this requirement, the data is simply discarded.
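
Again as an illustrative sketch only: the 20% figure comes from the description above, but how pan, tilt, and zoom offsets are measured and compared is our own assumption. The matching logic might look something like this.

```python
# Sketch of matching a recording to a pre-calibrated field of view.
# Assumption: camera motion relative to each calibrated view has already been
# measured and expressed as fractions of that view (e.g. pan = 0.1 means the
# view shifted horizontally by 10% of the calibrated frame width).
from dataclasses import dataclass
from typing import Optional

@dataclass
class ViewOffset:
    view_name: str
    pan: float    # horizontal shift, as a fraction of the calibrated frame
    tilt: float   # vertical shift, as a fraction of the calibrated frame
    zoom: float   # change in zoom, as a fraction of the calibrated zoom

MAX_OFFSET = 0.20  # no more than 20% panning, tilting, or zooming

def match_recording(offsets: list[ViewOffset]) -> Optional[str]:
    """Return the name of the closest calibrated view within tolerance, else None."""
    if not offsets:
        return None
    worst = lambda o: max(abs(o.pan), abs(o.tilt), abs(o.zoom))
    best = min(offsets, key=worst)
    return best.view_name if worst(best) <= MAX_OFFSET else None  # None -> discard
```

A recording that falls within 20% of several calibrated views is assigned to the closest one; if none qualify, the data is discarded as described above.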

One hour is 1/12, or about 8.3%, of a 12-hour day, so discarding one bad hour of video streamed during that day can improve result accuracy by up to 8.3%. Automatic movement detection combined with discarding uncorrectable video is therefore a simple and effective way to eliminate errors caused by camera motion.

For more information about this blog post or to find out more about how our traffic safety video analysis products can benefit you, please feel free to contact us directly.

 

Julie Levy, B.Eng.
Project Delivery Manager

Julie is a Jr. Transportation Engineer. She has worked closely with municipalities, organizations, and engineering firms across Canada, the US, and Latin America to define project scopes and ensure timely deliveries. She has worked on numerous vision-based road safety projects, including evaluating safety performance and determining the effectiveness of before/after interventions using surrogate safety indicators.

 

Paul St-Aubin, P.Eng., Ph.D.
Sr. Product Manager

Transoft Solutions | Montreal, Quebec

Paul is a Transportation Engineer and Sr. Product Manager for Transoft Solutions’ Transportation Safety division, as well as a post-doctoral fellow at the University of Waterloo. He has over 10 years of experience developing, deploying, and overseeing large-scale road safety analysis technologies, including recent cutting-edge work in predictive collision course modelling. Paul specializes in several civil engineering and computer science topics, including road and safety design, driver behaviour, advanced collision detection, traffic control devices, vehicle automation, traffic data collection systems, computer vision, and ITS.