Deep learning algorithm and event-based image sensor combination greatly improves drone reaction times
In this week’s roundup from the Association for Unmanned Vehicle Systems International, which highlights some of the latest news and headlines in unmanned vehicles and robotics: a new camera allows drones to better dodge fast-moving objects, researchers at Queensland University of Technology use drones and infrared technology to monitor animal populations in bushfire-affected areas, and a new partnership promises improved situational awareness for drone flights.
Drone equipped with special cameras can dodge fast-moving objects
Researchers from the University of Zurich have equipped a drone with a novel type of camera to give it the ability to detect and avoid fast-moving objects.
According to the researchers, drones equipped with conventional cameras typically take 20 to 40 milliseconds to process an image and react to detect obstacles, which is not quick enough to avoid a bird or another drone. It also isn't quick enough to avoid a static obstacle when the drone itself is flying at high speed.
To solve this problem, the researchers equipped a quadcopter with special cameras and algorithms that reduce its reaction time to a few milliseconds, quick enough for it to avoid a ball thrown at it from a short distance. This type of reaction time could make drones especially effective in situations such as the aftermath of a natural disaster.
“For search and rescue applications, such as after an earthquake, time is very critical, so we need drones that can navigate as fast as possible in order to accomplish more within their limited battery life,” explains Davide Scaramuzza, who leads the Robotics and Perception Group at the University of Zurich as well as the NCCR Robotics Search and Rescue Grand Challenge.
“However, by navigating fast, drones are also more exposed to the risk of colliding with obstacles, even more so if those obstacles are moving. We realized that a novel type of camera, called an event camera, is a perfect fit for this purpose.”
The researchers explain that traditional video cameras, such as the ones found in smartphones, work by regularly taking snapshots of the whole scene, which is done by exposing all of the pixels of the image at the same time. With this technique, though, a moving object can only be detected after all the pixels have been analyzed by the on-board computer.
Event cameras, a recent innovation, have smart pixels that work independently of each other. Pixels that detect no changes remain silent, while those that see a change in light intensity immediately send out the information. This means that only a tiny fraction of all the pixels of the image need to be processed by the onboard computer, which speeds up the computation “a lot,” according to the researchers.
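A single event pixel can be modeled in a few lines. The sketch below is an illustrative toy model, not the camera's actual circuitry or API: each pixel remembers the last brightness it reported and fires an event only when the log intensity changes by more than a contrast threshold, so an unchanging pixel produces no output at all.

```python
import math

def pixel_events(intensities, threshold=0.2):
    """Toy model of one event-camera pixel.

    intensities: successive light intensities seen by this pixel.
    threshold:   change in log intensity needed to fire an event.
    Returns a list of (time_index, polarity) events; a pixel that
    sees no change stays silent and returns an empty list.
    """
    events = []
    last = math.log(intensities[0])  # last reported log intensity
    for t, i in enumerate(intensities[1:], start=1):
        li = math.log(i)
        # Fire one event per threshold crossing: brighter (+1) or darker (-1).
        while abs(li - last) >= threshold:
            polarity = 1 if li > last else -1
            events.append((t, polarity))
            last += polarity * threshold
    return events
```

A constant input yields no events at all, which is why most of the image never has to be transmitted or processed.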
The researchers note that existing object-detection algorithms for drones do not work well with event cameras, so they developed their own algorithms, which collect all the events recorded by the camera over a very short time and then subtract the effect of the drone’s own movement, which typically accounts for most of the changes in what the camera sees.
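In spirit, that per-window processing could look like the sketch below, a heavy simplification in Python with NumPy and not the team's actual code: it assumes the drone's ego-motion reduces to a constant pixel velocity over the window, and the function and parameter names are illustrative. Events are warped to a common reference time so that background events land where the static scene actually is, and pixels whose compensated events still carry late timestamps are flagged as an independently moving object.

```python
import numpy as np

def moving_object_mask(events, ego_flow, t_ref, shape=(64, 64), rho=0.8):
    """Hedged sketch of ego-motion-compensated event processing.

    events:   iterable of (x, y, t) tuples from one short time window.
    ego_flow: (vx, vy) pixel velocity induced by the drone's own motion,
              assumed constant over the window (a strong simplification).
    t_ref:    reference time the events are warped to.
    rho:      threshold on the normalized mean-timestamp image.
    """
    t_sum = np.zeros(shape)
    n = np.zeros(shape)
    vx, vy = ego_flow
    for x, y, t in events:
        # Undo the apparent motion caused by the drone itself.
        xc = int(round(x + vx * (t_ref - t)))
        yc = int(round(y + vy * (t_ref - t)))
        if 0 <= xc < shape[1] and 0 <= yc < shape[0]:
            t_sum[yc, xc] += t
            n[yc, xc] += 1
    # Pixels dominated by recent (late-timestamp) events after
    # compensation are the ones a moving object is passing through.
    mean_t = np.where(n > 0, t_sum / np.maximum(n, 1), 0.0)
    norm = mean_t / mean_t.max() if mean_t.max() > 0 else mean_t
    return norm > rho
```

Because only the few pixels that fired events enter the loop, the whole window can be processed in the millisecond range the researchers describe.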
Initially, Scaramuzza and his team tested the cameras and algorithms alone. They threw objects of various shapes and sizes towards the camera, and measured how efficient the algorithm was in detecting them. Depending on the size of the object and the distance of the throw, the success rate varied between 81 and 97 percent. The system took just 3.5 milliseconds to detect incoming objects.
Next, the researchers put the cameras on an actual drone, and threw objects directly at it while conducting indoor and outdoor flights. The drone was able to avoid the objects more than 90 percent of the time, including when a ball was thrown from a three-meter distance while traveling at 10 meters per second. Researchers say that when the drone “knew” the size of the object in advance, one camera was enough, but when it had to face objects of varying size, two cameras were used to give it stereoscopic vision.
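The role of the second camera follows from the classical pinhole stereo relation: an object's distance equals focal length times camera baseline divided by the disparity between the two views, so known camera geometry substitutes for knowing the object's physical size. A minimal sketch, with illustrative numbers rather than the drone's actual calibration:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Pinhole stereo relation: depth Z = f * B / d.

    focal_px:     camera focal length, in pixels.
    baseline_m:   distance between the two cameras, in meters.
    disparity_px: horizontal shift of the object between the two views.
    """
    if disparity_px <= 0:
        raise ValueError("object must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px
```

For example, with a 10 cm baseline and a 400-pixel focal length, an object seen 20 pixels apart in the two views sits 2 meters away.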
Scaramuzza says that these results show that event cameras can increase the speed at which drones can navigate by up to ten times, which greatly expands their possible applications.
“One day drones will be used for a large variety of applications, such as delivery of goods, transportation of people, aerial filmography and, of course, search and rescue,” Scaramuzza says.
“But enabling robots to perceive and make decisions faster can be a game changer for other domains as well, wherever reliably detecting incoming obstacles plays a crucial role, such as automotive, goods delivery, transportation, mining, and remote inspection with robots.”
The team would like to test this system on an even more agile quadrotor in the near future.
“Our ultimate goal is to one day make autonomous drones navigate as well as human drone pilots. Currently, in all search and rescue applications where drones are involved, the human is actually in control,” says Davide Falanga, the PhD student who is the primary author of the article.
“If we could have autonomous drones navigate as reliably as human pilots, we would then be able to use them for missions that fall beyond the line of sight or beyond the reach of the remote control.”
QUT researchers to use drones to identify wildlife populations in bushfire-affected areas
As part of a collaborative project, researchers at Queensland University of Technology (QUT) in Queensland, Australia will use drones and infrared imaging to identify wildlife populations in bushfire-affected areas. Researchers currently use this combination of technology to detect koalas.
In collaboration with Noosa and District Landcare, Associate Professor Grant Hamilton has launched a project that will use technology and artificial intelligence to create a “census” of animals that survived the bushfires.
“We know that the devastating bushfires have had a terrible impact on wildlife, but in order to help protect those which survived the terrible disaster we first need to identify the populations of animals that are still in fire-affected areas,” Professor Hamilton says.
In 2019, Professor Hamilton, along with PhD student Evangeline Corcoran and Dr Simon Denman from QUT, and John Hanger and Bree Wilson from Endeavour Veterinary Ecology, co-authored a study that detailed a technique that uses drones capable of detecting heat signatures to locate koalas. The researchers created an algorithm that was designed to identify the heat signatures of koalas, which from ground level can be hard to spot.
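The study's actual detector is machine-learned, but the underlying idea can be sketched as thresholding a thermal frame to a plausible body-temperature band and keeping only contiguous warm blobs of sufficient size. The temperature band, minimum blob size, and function names below are illustrative assumptions, not the published algorithm's parameters.

```python
import numpy as np
from collections import deque

def detect_heat_signatures(frame, t_low=30.0, t_high=38.0, min_pixels=3):
    """Sketch: find warm blobs in a thermal frame (temperatures in C).

    Returns (row, col) centroids of 4-connected regions whose pixels
    fall inside an assumed animal body-temperature band.
    """
    mask = (frame >= t_low) & (frame <= t_high)
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    detections = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                # BFS flood fill over one 4-connected warm region.
                queue, blob = deque([(sy, sx)]), []
                seen[sy, sx] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(blob) >= min_pixels:  # drop single-pixel noise
                    ys, xs = zip(*blob)
                    detections.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return detections
```

In practice the learned detector replaces the fixed threshold and blob test, which is what lets the system beat expert observers on accuracy.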
According to Noosa and District Landcare acting general manager Rachel Lyons, koala habitats in the Noosa region at Peregian Springs, Weyba Downs, Doonan and Cooroibah have been affected by the recent bushfires.
“The main burn zones were very intense with full canopy fires; flanking areas and back-burn sites varied in intensity,” Lyons says.
“Koalas were unlikely to have survived the canopy fire areas and we have held concerns about the survival of koalas in the less intense sites. Koalas have been rescued within the areas post-fire as a result of concerns for health and food availability.”
The Sunshine Coast research project was the first in what could be a wider study of koala populations in bushfire-affected areas, Professor Hamilton says. Professor Hamilton is currently having discussions with other animal welfare groups to carry out drone surveys for koalas in other areas.
“We are also working on ways to extend our algorithm to automatically detect other species likely to be found in the areas,” Professor Hamilton says.
“Last year, we were able to show that our system for detecting koalas was more accurate than using expert observers. Another advantage of using drones to spot animals from the air is that we can cover areas much more quickly than people conducting a search on foot, and we can cover areas that are too inaccessible for spotters on the ground.”
Researchers will begin their drone survey of the Noosa region fire areas as soon as cooler morning temperatures allow the drones' heat-sensing cameras to detect koalas by their heat signatures.
Altitude Angel, INVOLI partner to provide 'unrivaled picture of airspace' for UAS operations
Unmanned Traffic Management (UTM) technology provider Altitude Angel is partnering with low-altitude air traffic data provider INVOLI to provide what they describe as “an unrivalled picture of airspace” to air navigation service providers (ANSPs), operators, pilots, and drone-centric risk management applications.
After working together informally on a few initiatives, Altitude Angel and INVOLI believe a strategic partnership is beneficial for both of them. Most recently, the companies worked together at the African Drone Forum and Lake Kivu Challenge 2020, where Altitude Angel served as the lead and umbrella UTM provider, while INVOLI delivered its air traffic awareness system and drone tracking platform.
“Strategic partnerships, like the one we have formed with INVOLI, are a clear demonstration of how two businesses with a mutual vision for future drone flights can collaborate and, in doing so, move an entire industry forward,” says Richard Parker, Altitude Angel CEO and founder.
“The addition of INVOLI’s data to our already comprehensive platform will bring a new clarity to the airspace picture and in doing so, bring BVLOS flight a step closer.”
The companies will create a true situational awareness picture of the drone ecosystem by integrating INVOLI’s low-altitude air traffic awareness data stream with Altitude Angel’s airspace and ground hazard data, via its GuardianUTM platform. As a result, organizations that are adopting UTM technologies will have a ‘best-in-class’ enriched data source.
Altitude Angel says that it will also have the ability to propagate the data from INVOLI to other applications and partner platforms, including DroneSafetyMap.com, Guardian App, and bespoke national drone apps from ANSPs.
“As two of the world’s most progressive organizations whose shared aim is to open our skies safely and securely to regular, automated drone flights, I’m excited by our partnership with Altitude Angel,” says Manu Lubrano, INVOLI CEO and co-founder.
“The next few years are critical for the drone industry, and we’re pleased to have Altitude Angel at our side to together enable safe drone-use worldwide.”
Compiled by Brian Sprowl, Associate Editor, AUVSI
Share your vision-related news by contacting Dennis Scimeca, Associate Editor, Vision Systems Design