Drone measures plant stress and nutrient content with multispectral camera
A partnership between vision-guided unmanned aerial vehicle (UAV) company QuestUAV and drone sensor and analytics company MicaSense has enabled the acquisition of multispectral images to monitor the health, stress, and nutrient content of crops.
Together with the MicaSense RedEdge multispectral camera, the Q-100 DATAhawk UAV is already being used worldwide in agricultural and fruit farming applications. The fixed-wing drone can cover more than a square mile in a single flight, providing high-quality multispectral data for crop monitoring and analysis in an automated workflow.
The RedEdge camera captures images in blue, green, red, red edge, and near-infrared bands, and its narrowband optical filters provide full imager resolution for each band. The camera features a global shutter image sensor for distortion-free results and captures all five bands once per second. The hand-launchable Q-100 DATAhawk is a rugged UAV that can fly for up to 55 minutes; multiple landing options, including automatic and parachute landing, allow for easy and safe operation in open and confined environments.
QuestUAV’s flight planning software enables users to plan a flight mission from autonomous take-off, through surveying the site at a specified height and image overlap, to autonomous landing. Images are acquired as soon as the UAV reaches survey height and starts flying the grid lines. Every second, the sensor captures five images (one for each band) and writes them directly to the drone’s internal SD card. The ground sampling distance at 400 ft. is 3.5 in., and geo-coordinates are written directly to each image’s exchangeable image file format (EXIF) metadata.
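The quoted ground sampling distance follows from simple pinhole-camera geometry: GSD = flight altitude × pixel pitch ÷ focal length. A back-of-envelope sketch, using focal-length and pixel-pitch values that are assumptions for illustration rather than figures from the article:

```python
# Back-of-envelope ground sampling distance (GSD) estimate.
# GSD = flight altitude * pixel pitch / focal length.
# The optics figures below are illustrative assumptions, not from the article.
altitude_m    = 400 * 0.3048   # 400 ft converted to metres
pixel_pitch_m = 3.75e-6        # assumed 3.75 um pixel pitch
focal_len_m   = 5.5e-3         # assumed 5.5 mm focal length

gsd_m  = altitude_m * pixel_pitch_m / focal_len_m   # metres per pixel on the ground
gsd_in = gsd_m / 0.0254                             # metres -> inches
# With these assumed optics, the result lands in the same ballpark as the
# ~3.5 in. figure quoted above; the exact value depends on the real sensor specs.
```

GSD scales linearly with altitude, so flying lower than 400 ft. proportionally shrinks the ground footprint of each pixel.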
Following flight, images can be uploaded directly from the SD card to MicaSense ATLAS for storage, photogrammetric processing, analysis, and presentation. Datasets are processed as soon as the upload is completed, and outputs are available within 24 hours. ATLAS is a cloud-based solution that enables users to automatically process multispectral data and extract multiple outputs such as orthomosaics, vegetation index maps (e.g. normalized difference vegetation index [NDVI] and normalized difference red edge [NDRE]), and digital surface models (DSMs). Each layer is reflectance-calibrated and registered at the sub-pixel level, with the value of each pixel indicating percent reflectance for that band, providing valuable information on crop health at all stages of growth, according to QuestUAV.
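The two vegetation indices named above are simple normalized band ratios computed per pixel from the calibrated reflectance layers. A minimal sketch of the arithmetic, with made-up reflectance values standing in for real ATLAS output:

```python
import numpy as np

# Hypothetical per-band reflectance rasters (values in [0, 1]); in practice
# these would come from the reflectance-calibrated output layers.
red      = np.array([[0.08, 0.10], [0.30, 0.28]])
red_edge = np.array([[0.25, 0.27], [0.32, 0.30]])
nir      = np.array([[0.45, 0.50], [0.35, 0.33]])

def normalized_difference(a, b):
    """Generic normalized-difference index: (a - b) / (a + b)."""
    return (a - b) / (a + b)

ndvi = normalized_difference(nir, red)       # healthy, dense canopy -> high NDVI
ndre = normalized_difference(nir, red_edge)  # sensitive to chlorophyll/nitrogen status
```

Both indices fall in the range [-1, 1]; NDRE uses the red edge band in place of red, which makes it useful later in the season when NDVI saturates over dense canopy.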
Once processing is finished, all layers can be viewed in a secure web-based map interface. Datasets are organized in a ‘Farms and Fields’ structure: users can create field boundaries online, and ATLAS automatically organizes the data into those boundaries. The map interface allows users to view RGB orthomosaics and index maps in a multi-layer stack, as well as to scout the field and share information with their farm management team.
With the map interface of ATLAS, users can view and analyse data from any online device, giving farmers and agronomists the ability to monitor crop status and plant health over time in order to detect patterns correlated to crop vigour, plant stress, and nutrient content.
View more information on QuestUAV.
View more information on MicaSense.
About the Author
James Carroll
Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.