Smart City Success Through Connected Sensors and Edge Analytics
Andrea Sorri
Andrea Sorri is Segment Development Manager Smart Cities EMEA at Axis Communications.
A larger proportion of the world’s population is living in cities than ever before, and the trend shows little sign of slowing. Though the pandemic briefly encouraged some people to consider a more rural lifestyle, in reality this is an option available only to a minority. According to the United Nations, although urban areas cover only around 2% of the planet’s surface, 68% of people are expected to live in a city by 2050.
It’s a complex picture. Numerous factors are interlinked to make cities attractive to citizens and businesses. The all-important factor of “liveability” is something that city administrators strive for, and though it may seem like an intangible element, it is actually based on a number of very measurable aspects. Safety and security, mobility and transportation, air quality, noise pollution, employment opportunities, green spaces—all of these combine to enhance a city’s liveability.
Increasingly, city authorities are looking to technology and data to help them meet their objectives (often linked to the UN’s Sustainable Development Goals). So-called “smart cities” are those that employ connected devices, sensors, systems, and data—breaking down silos between city departments—to deliver on their stated aims and goals.
Technology, particularly video surveillance with advanced edge analytics combined with multiple types of environmental sensors, has become foundational within forward-thinking urban environments.
Analytics at the edge of the network
Forensically detailed video data, analyzed in milliseconds and creating alerts for operators or triggering actions, can make a significant difference in emergency services’ incident response times. This capability comes from a new breed of surveillance cameras with advanced deep learning-based analytics running at the “edge” of the network, within the surveillance camera itself.
Edge analytics brings scalability and flexibility. Analyzing video within the camera and as close as possible to its capture means that the images being reviewed are of the highest possible quality. The images are also analyzed instantaneously—there is a vastly reduced latency in applying analytics within the camera itself.
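As a minimal sketch of this in-camera detect-and-alert loop (in Python, with a hypothetical `analyze_frame` standing in for the camera’s on-board deep-learning model):

```python
import time
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float

def analyze_frame(frame):
    # Hypothetical stand-in for the camera's on-board deep-learning model.
    return [Detection("vehicle", 0.93)] if frame.get("vehicle") else []

def process(frame, alert_sink):
    # Analyze the frame where it was captured and raise alerts immediately,
    # recording how long detection-to-alert took.
    start = time.monotonic()
    detections = [d for d in analyze_frame(frame) if d.confidence >= 0.8]
    for d in detections:
        alert_sink.append({"label": d.label, "latency_s": time.monotonic() - start})
    return detections

alerts = []
process({"vehicle": True}, alerts)
```

The point of the sketch is architectural: the analysis happens where the frame is captured, so the alert carries only the model’s inference time rather than any network round trip.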
Edge analytics is a valuable complement to server- and cloud-based analytics. While analytics in the camera itself brings advantages in speed and accuracy by analyzing images close to their capture, it also creates valuable metadata. When this is combined with image data and analyzed in a data center or the cloud, numerous new insights can be found. First, however, let’s look at some smart city use cases.
The multiple benefits of improved traffic management
There isn’t a city around the world without the challenge of effective traffic management. While many authorities are looking at ways of reducing overall levels of traffic in cities—and certainly trying to change the nature of traffic to more sustainable and environmentally friendly forms—effectively moving vehicles around is central to the efficient working of a city.
Road signage is an essential tool for managing traffic in cities, alerting drivers to hazards and blockages and offering alternate routes and updates on delays. Video surveillance analytics can support the automation of signage, but it relies on accuracy. Traditionally, traffic analytics has suffered from a high number of false alarms. Something as simple as a puddle or shadow on the road could be mistaken for a vehicle stopped on the carriageway.
Edge analytics can deliver the same level of accuracy in the detection, identification, and classification of all types of objects as server- and cloud-based analytics. A generic “vehicle” becomes a car, bus, or motorcycle and, critically, objects that aren’t relevant can be safely ignored.
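A sketch of how classification suppresses such false alarms; the class names and speed threshold here are illustrative assumptions, not any vendor’s API:

```python
RELEVANT_CLASSES = {"car", "bus", "truck", "motorcycle"}

def stopped_vehicle_alerts(detections):
    # Alert only on stationary objects the classifier identifies as vehicles;
    # shadows, puddles, and pedestrians never reach the operator.
    return [d for d in detections
            if d["class"] in RELEVANT_CLASSES and d["speed_kmh"] < 2.0]

scene = [
    {"class": "car", "speed_kmh": 0.0},     # stopped car: raise an alert
    {"class": "person", "speed_kmh": 0.0},  # standing pedestrian: ignore
    {"class": "bus", "speed_kmh": 35.0},    # moving bus: ignore
]
alerts = stopped_vehicle_alerts(scene)
```

Because a shadow or puddle is never classified as a vehicle in the first place, it cannot trigger a stopped-vehicle alarm.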
As an example, Atlanta, Georgia, has employed surveillance cameras and analytics to create a “smart corridor” for traffic on one of the main midcity routes. A specific goal of the project is to improve the quality of life in the city by reducing emissions and pollution on a route that carries almost 29,000 vehicles every day.
The system uses Citilog’s (Paris, France; www.citilog.com) analytics application installed on 84 Axis (Lund, Sweden; www.axis.com) surveillance cameras covering 26 intersections along the 2.3-mile route. Traffic data—including vehicle counts, speed, and occupancy—is used by SURTRAC, an adaptive traffic signal control technology developed at the Robotics Institute at Carnegie Mellon University (Pittsburgh, PA, USA; www.ri.cmu.edu) that optimizes the performance of traffic signals.
By enabling real-time adjustment of traffic signals, the solution improves travel times and reduces waiting time at intersections. Over the longer term, the recorded data allow city planners to optimize traffic operations through offline analysis.
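To illustrate the general idea of demand-responsive timing, here is a much-simplified sketch that splits a fixed signal cycle among approaches in proportion to observed vehicle counts. This heuristic is an assumption for illustration only; SURTRAC itself performs schedule optimization, not proportional allocation:

```python
def allocate_green(counts, cycle_s=90, min_green_s=10):
    # Split a fixed cycle among approaches in proportion to demand,
    # guaranteeing every approach a minimum green time.
    # (A real controller would renormalize if the minimums push the
    # total past the cycle length.)
    if sum(counts.values()) == 0:
        share = cycle_s / len(counts)
        return {approach: share for approach in counts}
    total = sum(counts.values())
    return {approach: max(min_green_s, cycle_s * n / total)
            for approach, n in counts.items()}

greens = allocate_green({"north": 30, "east": 10}, cycle_s=80)
```

Even this toy version shows why per-approach counts from cameras matter: without them, the split would have to be fixed in advance regardless of actual demand.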
The peripheral impacts of poor traffic management
Anyone living in an urban area where traffic is poorly managed will know that it causes more than just delays. Deterioration of air quality and noise pollution also result and both can have a significant impact on the health and well-being of citizens.
Slow-moving vehicles and those sitting in traffic jams are among the biggest contributors to poor air quality in cities. Highly sensitive air quality sensors can give city authorities real-time insight into where pollution is increasing above normal or desirable levels. When combined with video surveillance, the reason for the deterioration in air quality can be identified and appropriate actions taken.
In combination with the traffic management systems mentioned earlier, traffic could be rerouted to alleviate the immediate issue. Over time, city planners can use analysis of aggregated data to define long-term solutions to ongoing air quality issues.
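A sketch of such a fusion rule, combining an air quality reading with a camera-derived queue count; the PM2.5 limit and queue threshold are illustrative assumptions, not regulatory standards:

```python
def should_reroute(pm25_ugm3, stationary_vehicles,
                   pm25_limit=35.0, queue_limit=20):
    # Hypothetical fusion rule: recommend rerouting only when pollution is
    # elevated AND the camera confirms congestion as the likely cause.
    return pm25_ugm3 > pm25_limit and stationary_vehicles > queue_limit
```

Requiring both signals is the key design choice: a pollution spike with no visible queue points to a different source, and a queue with clean air needs no air-quality response.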
Noise: an essential part of city life, to a point
While air quality is one of the more visible issues in cities, whether caused by traffic or other industrial activity, the impacts of noise pollution on health and well-being are less well understood.
Intuitively, we understand and expect cities to be relatively noisy environments. But sustained noise, whether at a higher level than desirable or of a particular type, can be extremely detrimental to health. In fact, it comes as a surprise to many that noise pollution is the second most damaging environmental threat to human health in urban environments: respiratory agitation, high blood pressure, gastritis, colitis, and even heart attacks have all been linked to it.
Far from the traditional “dumb” decibel meters, today’s advanced acoustic sensors feature vast numbers of small, high-quality microphones and can create an accurate visualization of where a sound is coming from, how loud it is, the nature of the sound (e.g., continuous vs. pulsing or interrupted), at what point the volume drops off, and more.
By utilizing such technology, smart cities can pinpoint the source of a particular noise. Connected video surveillance used in conjunction can help identify whether the noise is coming from a construction site, a busy intersection, or a nightclub operating beyond its licensed hours, allowing an appropriate response to be triggered.
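One simple way to pair the two systems, sketched here with hypothetical camera presets, is to cue the camera whose preset bearing lies closest to the direction the acoustic sensor reports:

```python
def circular_distance(a_deg, b_deg):
    # Shortest angular distance between two compass bearings.
    d = abs(a_deg - b_deg) % 360
    return min(d, 360 - d)

def nearest_preset(bearing_deg, presets):
    # Pick the camera preset whose bearing is closest to the reported
    # sound-source direction (preset names are illustrative).
    return min(presets, key=lambda p: circular_distance(p["bearing"], bearing_deg))

presets = [
    {"name": "north", "bearing": 0},
    {"name": "east", "bearing": 90},
    {"name": "south", "bearing": 180},
    {"name": "west", "bearing": 270},
]
```

The circular distance matters: a sound reported at 350° is 10° from the north-facing preset, not 350°.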
Citizen safety and security
Mobility, air quality, and noise are factors highlighted by citizens in their impression of a city’s liveability. Feeling safe and secure is often high on the list, as well.
Video surveillance has played a role in law enforcement, security, and safety for many years. The arrival of edge analytics and increasingly sophisticated sensors is delivering more value to those focused on keeping a city safe and secure.
Interestingly, many of the application areas already mentioned also apply to safety and security. Efficient traffic management is essential in ensuring roadways are clear for emergency services vehicles when needed. Acoustic sensors can create alerts in relation to specific noises, from car accidents to raised voices and from gunshots to breaking glass.
Edge analytics can monitor a video stream, spotting anomalies, unusual patterns, specific objects, suspicious behavior, or alerts triggered by other sensors, and quickly bring an operator’s attention to the scene. Intervention can then be triggered through the emergency services or via on-site audio speakers, either warning criminals that they’re being watched or offering assistance, advice, and guidance to people at the scene. Quicker awareness of incidents leads to a more rapid and effective response.
The benefits of data (and metadata)
Edge analytics doesn’t just turn video information into data, which is largely what scene analytics uses; it also creates metadata, essentially data about the video data.
The combination of data and metadata created by edge analytics, when combined with analytics on the server or in the cloud, can be hugely useful in helping analyze enormous amounts of information collected over time. This will help authorities gain insights into areas of interest, what might be referred to as “what they know they don’t know” or “known unknowns”.
For instance, how many times have cars blocked bus lanes in the past month? Or what’s the average number of people entering a station between 7:00 a.m. and 9:00 a.m. on a weekday morning? Though authorities don’t know the answer, they know what they’re looking for.
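Queries of this kind amount to filtering and aggregating the metadata records over time. A minimal sketch, using hypothetical metadata records of the sort an edge camera might stream:

```python
from datetime import datetime

# Hypothetical metadata records streamed from edge cameras.
events = [
    {"class": "person", "ts": "2024-05-06T07:15:00"},
    {"class": "person", "ts": "2024-05-06T08:40:00"},
    {"class": "person", "ts": "2024-05-06T12:05:00"},
    {"class": "car",    "ts": "2024-05-06T08:10:00"},
]

def count_between(events, cls, start_hour, end_hour):
    # Count metadata events of one class within an hour-of-day window.
    return sum(1 for e in events
               if e["class"] == cls
               and start_hour <= datetime.fromisoformat(e["ts"]).hour < end_hour)
```

Answering the station question above becomes `count_between(events, "person", 7, 9)`: the camera never has to re-process video, because the lightweight metadata already carries the answer.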
While this capability moves the benefits of edge analytics forward once more, possibly the greatest value will come through the “unknown unknowns” when analytics start delivering insights into what authorities didn’t know they didn’t know.
This is where the true potential of edge analytics in video surveillance and connected sensors lies: the analysis over time of vast amounts of data leading to the identification of patterns and their anomalies and enabling as yet unforeseen improvements in safety, the environment, and mobility.