Weed Control System Uses Robotic Dog, Machine Vision, and Flamethrower
Researchers at Texas A&M University (College Station, TX, USA) have developed an experimental weed-control system for farmers: a mobile robotic dog equipped with a robotic arm, cameras, AI software, and a propane-powered flamethrower.
The robotic arm is mounted on the robotic dog and grips the flamethrower. The dog/arm combination moves into position near a weed and then triggers the flame. The system doesn’t burn weeds down. Instead, the flame heats a weed’s core, which stunts its growth.
The researchers use only the edge of the flame to heat the weed because they do not want to damage the soil.
However, developing the system was challenging. Flames are affected by dynamic conditions such as wind, gravity, atmospheric pressure, fuel tank pressure, and the pose of the nozzle. As the researchers explain in an article posted on arXiv, a preprint server (https://bit.ly/4ftiyUr), “The flame for weed removal needs a model that can capture its deformation caused by the outdoor environment and the fluctuation in gas fuel pressure.”
To help address this problem, the researchers devised a model that estimates the size and shape of the flame based on the strength of the wind.
In addition to researchers from Texas A&M University, one author is from Boston Dynamics (Waltham, MA, USA), developer of the yellow robotic dog, Spot, used in the experimental weed-control system.
Hardware Components of the System
Specifically, the hardware platform consists of a Spot Mini quadruped robot. A 6-DoF Z1 arm from Unitree Robotics (Hangzhou, China), with a payload capacity of 2 kg, is mounted on the dog’s back. Both the dog and the arm are powered by Spot’s internal battery.
A RealSense D435 RGB-D camera from Intel (Santa Clara, CA, USA) is mounted on the robotic arm’s end-effector. Images from this camera are used for weed detection and localization. Two Lepton 3.5 thermal cameras from Teledyne FLIR (Wilsonville, OR, USA) monitor flame direction and coverage: one is mounted on the end-effector, and the second is attached to the front of Spot’s body.
“The arm, thermal cameras, and RGB-D camera form a hand-eye vision system, as this configuration allows the system to obtain observations of the weed scene and torch flame at different angles and distances,” they write.
The algorithms for perception, decision-making, and system control run on the dog’s onboard computer, the Spot CORE.
The flame torch and a relay-controlled lighter are mounted on the tip of the arm, while a propane tank is mounted on Spot. Gas is delivered to the torch via a torch control unit.
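The article does not specify how the torch control unit is driven, but the firing sequence follows from the description above. The following is a minimal sketch, assuming a GPIO-driven gas solenoid and igniter relay (the pin numbers and timings are hypothetical), written in Python with the gpiozero library:

```python
import time
from gpiozero import OutputDevice

GAS_VALVE_PIN = 17  # hypothetical GPIO pin driving the gas solenoid relay
IGNITER_PIN = 27    # hypothetical GPIO pin driving the lighter relay

gas_valve = OutputDevice(GAS_VALVE_PIN)
igniter = OutputDevice(IGNITER_PIN)

def fire_torch(burn_seconds: float) -> None:
    """Open the gas valve, spark the lighter, hold the flame, then shut off."""
    gas_valve.on()
    time.sleep(0.2)           # let gas reach the nozzle before sparking
    igniter.on()
    time.sleep(0.5)           # hold the spark long enough to ignite
    igniter.off()
    time.sleep(burn_seconds)  # keep the flame on the weed
    gas_valve.off()           # closing the valve extinguishes the flame
```

The key design point is the ordering: gas flows before the spark fires, and closing the valve, rather than anything flame-side, is what ends the burn.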
Software Architecture
To detect a weed, the researchers developed a coarse localization algorithm using YOLOv6. With this information, Boston Dynamics’ robot planning model, developed using the Spot SDK, generates the dog’s trajectory from its starting point to a position near the weed. After Spot arrives at this destination, an algorithm developed by the researchers calculates the weed’s center using a close-up image taken with the RGB-D camera.
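To illustrate the second stage, the sketch below combines a detected bounding box with a depth image to estimate a weed’s 3D center. It is a minimal sketch, assuming the detector returns pixel-space bounding boxes (the article does not specify the output format) and using Intel’s pyrealsense2 library, which supports the D435:

```python
import pyrealsense2 as rs

def weed_center_3d(depth_frame, intrinsics, bbox):
    """Estimate a weed's 3D center (meters, camera frame) from a detection.

    bbox: (x_min, y_min, x_max, y_max) in pixels, e.g., from the YOLOv6 stage.
    """
    u = int((bbox[0] + bbox[2]) / 2)  # pixel column at the box center
    v = int((bbox[1] + bbox[3]) / 2)  # pixel row at the box center
    depth_m = depth_frame.get_distance(u, v)  # depth at that pixel, in meters
    # Deproject pixel + depth into a 3D point using the camera intrinsics.
    return rs.rs2_deproject_pixel_to_point(intrinsics, [u, v], depth_m)
```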
At the same time, another algorithm developed by the researchers estimates the size and shape of the flame’s surface using temperature-threshold data taken from thermal images. This flame estimation algorithm is based on a geometric model: a circular arc curve with a Gaussian cross-section. “It is a geometric model that is similar to a bendable/twistable cone with a higher degree of freedom to reflect the flame change under windy conditions,” explains Shuangyu Xie, a study author and research assistant at the Department of Computer Science and Engineering at Texas A&M University.
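To make that geometry concrete, here is one rough way to parameterize such a bendable cone in code: a centerline that follows a circular arc, with bending increasing with the wind, and an intensity that falls off as a Gaussian around the centerline. This parameterization is an illustration only, not the paper’s exact formulation:

```python
import numpy as np

def flame_centerline(length, curvature, n=50):
    """Sample points along a circular-arc centerline in the x-z plane.

    length: arc length of the flame, in meters.
    curvature: 1/radius; a larger magnitude means a stronger wind-induced bend.
    """
    s = np.linspace(0.0, length, n)  # arc-length samples along the flame
    if abs(curvature) < 1e-9:        # no wind: the flame is a straight cone
        return np.column_stack([np.zeros(n), np.zeros(n), s])
    theta = s * curvature            # angle turned at each arc-length sample
    x = (1.0 - np.cos(theta)) / curvature  # lateral deflection (downwind)
    z = np.sin(theta) / curvature          # distance along the nozzle axis
    return np.column_stack([x, np.zeros(n), z])

def cross_section_weight(radial_dist, sigma):
    """Gaussian falloff of flame intensity away from the centerline."""
    return np.exp(-0.5 * (radial_dist / sigma) ** 2)
```

Under a model like this, the wind’s effect shows up mainly in the arc’s curvature, which is exactly what a straight-line model cannot capture.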
In the next step of the process, a manipulation planning model determines the pose of the flamethrower’s nozzle based on the weed’s center and the flame’s predicted surface. A motion plan for the arm is created using the SDK from Unitree.
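One simple way to picture that calculation: stand the nozzle off from the weed center by the flame’s predicted reach, so that the flame’s edge rather than its hot core lands on the target. The sketch below illustrates the idea; the pitch angle and the reach-based standoff are our assumptions, not values from the paper:

```python
import numpy as np

def nozzle_pose(weed_center, flame_reach, pitch_deg=35.0):
    """Pick a nozzle position and aim direction for a given weed center.

    weed_center: (x, y, z) of the weed in the robot frame, in meters.
    flame_reach: predicted distance from nozzle tip to the usable flame edge.
    """
    pitch = np.radians(pitch_deg)
    # Unit vector pointing forward and down at the chosen pitch angle.
    aim = np.array([np.cos(pitch), 0.0, -np.sin(pitch)])
    # Back the nozzle off along the aim direction by the flame's reach so
    # the flame's edge, not its hot core, lands on the weed.
    position = np.asarray(weed_center) - flame_reach * aim
    return position, aim
```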
Validating the System with Field Experiments
To validate the flame estimation algorithm, the researchers created a dataset of 156 pairs of flame images, taken with the thermal cameras in an indoor environment where they could manually control the speed and direction of the wind. They then classified the images by wind speed and direction.
Using those images, they compared the performance of their flame estimation algorithm with an algorithm based on a straight-line model. The circular arc model maintained its performance in both light and strong wind, while the straight-line model’s performance degraded under strong wind.
The researchers then tested their weed-flaming system on weeds (sunflower, giant ragweed, and small melon) in a raised bed. They conducted five trials of the entire flaming pipeline, from motion planning to igniting the flame and heating the weeds. The average precision in targeting the weeds across the five trials was 94.4%.
The researchers plan to build on their work in future iterations of the system. “The current system has the ability to capture the flame bend due to the wind, but we did not design algorithms to cope with the windy effect after a wind direction change. We will design a follow up algorithm that adjusts flaming strategy considering dynamic wind,” Xie says.
Linda Wilson | Editor in Chief
Linda Wilson joined the team at Vision Systems Design in 2022. She has more than 25 years of experience in B2B publishing and has written for numerous publications, including Modern Healthcare, InformationWeek, Computerworld, Health Data Management, and many others. Before joining VSD, she was the senior editor at Medical Laboratory Observer, a sister publication to VSD.