Repurposed traffic cameras anonymously track social distancing during COVID outbreak
In 2020, when England enacted a lockdown to slow the spread of COVID-19, a company that had installed thousands of sensors equipped with artificial intelligence (AI) software in traffic lights wondered whether the technology could analyze foot traffic patterns before and after the lockdown.
Vivacity Labs (London, England, UK; www.vivacitylabs.com) develops AI sensors that recognize and analyze the traffic patterns of vehicles, bicyclists, and pedestrians (Figure 1). The company has deployed its sensors in traffic lights in 30 cities and towns in the U.K., including Cambridge, London, Manchester, Nottingham, and Oxford.
The sensors process video locally to generate data that informs the development of reactive or smart traffic lights. A 4G LTE modem built into the sensor taps into local cellular networks and transmits the data to an Amazon Web Services (Seattle, WA, USA; aws.amazon.com) or Google (Mountain View, CA, USA; about.google) Cloud Platform, after which the footage is deleted from the sensor. The data is then transmitted from the cloud to servers at Vivacity Labs.
“The data might be how many cars the sensor saw in the last five minutes, all the way down to what path did the pedestrians take as they walked along the road,” says Peter Mildon, co-founder and COO at Vivacity. “It’s all anonymous data, rather than a video stream.”
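Vivacity has not published its on-sensor code; the following Python sketch illustrates only the general pattern described above, in which each frame is reduced to anonymous counts on the device and only those counts are uploaded. The names detect_objects, upload_to_cloud, and camera_stream are hypothetical placeholders, not details from the company.

```python
# Illustrative sketch only, not Vivacity's code: analyze each frame locally
# and keep only anonymous, aggregated data; no imagery is stored or uploaded.
from collections import Counter

def process_frame(frame, detect_objects, counts):
    """Reduce one video frame to anonymous class counts; keep no imagery."""
    for label in detect_objects(frame):   # e.g. 'car', 'bicycle', 'pedestrian'
        counts[label] += 1
    # The frame itself is discarded here; only the counts persist.

def flush_interval(counts, upload_to_cloud):
    """Every few minutes, transmit only the aggregated counts, then reset."""
    upload_to_cloud({"interval_counts": dict(counts)})
    counts.clear()

# Usage sketch (hypothetical frame source and upload function):
# counts = Counter()
# for frame in camera_stream():
#     process_frame(frame, detect_objects, counts)
# flush_interval(counts, upload_to_cloud)
```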
The sensors mount on a lamp column at heights between five and ten meters and have a fixed lens with a focal length typically between 2.8 and 6 mm, though sometimes as long as 25 mm to meet depth of field requirements. The off-the-shelf lenses come from a variety of manufacturers, says Mildon. The other key components of the sensors are an NVIDIA (Santa Clara, CA, USA; www.nvidia.com) Jetson GPU, which runs the video processing and machine learning algorithms, and a 1080p board-level camera.
High-end, high-resolution cameras aren’t required for the application because the algorithm doesn’t look for identifying details, such as the specific make and model of a car. As long as the sensor recognizes that a person is walking down the street or a car is driving down the road, that’s good enough to serve its machine learning algorithms.
Vivacity Labs adapted an open source convolutional neural network (CNN) for the detection algorithms that determine whether an object in view is an automobile, bicycle, or pedestrian (Figure 2). The company trained this CNN on a proprietary dataset of a few hundred thousand images that it annotated over a five-year period. The images were gathered from a wide variety of sites via a strategic partnership with a traffic survey company. A proprietary algorithm tracks the locations and determines the paths of each identified object.
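Vivacity’s tracking algorithm is proprietary; purely as an illustration of the general idea of linking per-frame detections into per-object paths, the Python sketch below uses a simple nearest-centroid matcher. The class name, the matching threshold, and the data layout are assumptions, not details from the company.

```python
# Illustrative sketch only: a minimal nearest-centroid tracker that turns
# per-frame detections (from any CNN detector) into per-object paths.
# This is not Vivacity's proprietary algorithm; names are hypothetical.
from math import hypot

class CentroidTracker:
    def __init__(self, max_match_distance=50.0):
        self.max_match_distance = max_match_distance  # pixels
        self.next_id = 0
        self.paths = {}  # object id -> list of (frame, x, y)

    def update(self, frame_idx, detections):
        """detections: list of (x, y) centroids found in one frame."""
        unmatched = list(detections)
        for obj_id, path in self.paths.items():
            if not unmatched:
                break
            last_x, last_y = path[-1][1], path[-1][2]
            # Greedily match each existing track to its nearest new centroid.
            best = min(unmatched, key=lambda c: hypot(c[0] - last_x, c[1] - last_y))
            if hypot(best[0] - last_x, best[1] - last_y) <= self.max_match_distance:
                path.append((frame_idx, best[0], best[1]))
                unmatched.remove(best)
        # Any centroid left unmatched starts a new path.
        for x, y in unmatched:
            self.paths[self.next_id] = [(frame_idx, x, y)]
            self.next_id += 1
        return self.paths
```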
Traditionally, the sort of information on pedestrian traffic gathered by Vivacity Labs’ sensors includes the paths people take when crossing roads. The company can then produce an image of a particular road or intersection overlaid with those pedestrian routes. This data necessarily includes the position and time of every pedestrian within the sensor’s field of view.
When the COVID-19 lockdown took place, Vivacity realized that its sensor network could gauge not only the effect of the lockdown on road usage but also the effectiveness of the U.K.’s social distancing protocols. This effort would not require the same sort of position and time data needed to chart pedestrian paths, however. What matters for this behavioral analysis is the distance between pedestrians when their paths cross, not where within the camera’s field of view the crossing occurs.
The binary social distancing algorithm uses data gathered by post-processing the usual pedestrian data captured by the sensors. Each time a pair of pedestrian paths approach one another, the algorithm measures the minimum distance between them at their closest point. A distance of less than 2 m counts as a close interaction, and a distance of 2 m or greater counts as a not-close interaction.
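As a sketch of that post-processing step (not Vivacity’s implementation), the snippet below takes two pedestrian paths, assumed here to be sampled as timestamped positions in meters on a common ground plane, finds their minimum separation at shared timestamps, and labels the interaction against the 2 m threshold.

```python
# Illustrative sketch, not Vivacity's code: classify an interaction between
# two pedestrian paths as "close" (< 2 m) or "not close" (>= 2 m).
# Each path is assumed to be a dict mapping a timestamp to an (x, y)
# position in meters in a shared ground-plane coordinate system.
from math import hypot

CLOSE_THRESHOLD_M = 2.0

def minimum_separation(path_a, path_b):
    """Smallest distance between two pedestrians at any timestamp where
    both paths have a sample, or None if they never overlap in time."""
    shared_times = set(path_a) & set(path_b)
    if not shared_times:
        return None
    return min(
        hypot(path_a[t][0] - path_b[t][0], path_a[t][1] - path_b[t][1])
        for t in shared_times
    )

def classify_interaction(path_a, path_b):
    d = minimum_separation(path_a, path_b)
    if d is None:
        return None  # the two pedestrians were never in view together
    return "close" if d < CLOSE_THRESHOLD_M else "not close"
```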
For any given time period, the algorithm can determine the total number of interactions and what proportion of those interactions were close or not close. By comparing the proportions during the lockdown with those after the lockdown lifted, the company can analyze the effect of the lockdown on pedestrian behavior, or, to put it more simply, answer the question of whether pedestrians followed social distancing protocols during the lockdown.
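Continuing the sketch, and again only as an illustration with made-up numbers rather than Vivacity’s figures, the proportion of close interactions for a time period could be computed and compared as follows.

```python
# Illustrative sketch: aggregate interaction labels for a time period and
# compare the share of close interactions before and after the lockdown.
# The example label lists below are invented for demonstration only.
def close_proportion(labels):
    """labels: 'close' / 'not close' strings (None entries are ignored)."""
    interactions = [l for l in labels if l is not None]
    if not interactions:
        return 0.0
    return sum(1 for l in interactions if l == "close") / len(interactions)

during_lockdown = ["close", "not close", "not close", "not close"]
after_lockdown = ["close", "close", "not close", "close"]

print(f"Close interactions during lockdown: {close_proportion(during_lockdown):.0%}")
print(f"Close interactions after lockdown:  {close_proportion(after_lockdown):.0%}")
```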
The answer generally is yes, though confounding factors make the data gathered by Vivacity only a general estimate. At one point during the lockdown, for example, the British government changed its recommendation from maintaining a minimum distance of 2 m to maintaining 1 m, which complicates comparisons across the period in which people made the adjustment. Sidewalks less than 2 m wide may also generate close interactions that were unavoidable and therefore do not necessarily indicate lax social distancing.
The Department for Transport and other government bodies have previously received social distancing data from Vivacity as part of the British government’s monitoring of the impact of COVID-19.
About the Author
Dennis Scimeca
Dennis Scimeca is a veteran technology journalist with expertise in interactive entertainment and virtual reality. At Vision Systems Design, Dennis covered machine vision and image processing with an eye toward leading-edge technologies and practical applications for making a better world. Currently, he is the senior editor for technology at IndustryWeek, a partner publication to Vision Systems Design.