More and more, vision technologies are expanding beyond the factory floor into other areas. Some of these non-factory areas are experiencing the same issues as factories, one of which is a shortage of labor, a major reason John Deere recently introduced its autonomous tractor. “The food supply that farmers have to produce has to grow by 50% in the next 30 years because the population of the world is expected to grow to nearly 10 billion by 2050,” says Gaurav Bansal, director of engineering and autonomy at Blue River Technology, a wholly owned subsidiary of John Deere. “But the land remains the same. So, the farmers need to produce more out of the land. When we talk to the farmers, one of the critical problems they deal with is labor. Farmers are getting older. The average age of a farmer in the United States is 55 years old. Also, people have been migrating to urban areas. Around 3% of the land in the U.S. is defined as urban, and more than 80% of the people live in that 3%. Farms are almost all in the rural areas. The goal for us in shipping this autonomous tractor is to solve the labor shortage problem for farmers so they can focus on the more critical parts of their operations.”
The Tractor
John Deere expects to start shipping the tractor this fall. The total package comprises a John Deere 8R tractor, a TruSet™-enabled chisel plow, a GPS guidance system, and other advanced technologies. According to Bansal, the fall season is the busiest time of year for farmers, whose typical workdays run 16 to 18 hours.
The tractor employs a camera-based solution made up of six pairs of stereo cameras that provide 360° visibility around the tractor, what Bansal calls a safety bubble. If there is an unknown object in that safety bubble, the system stops the vehicle. Via an app connected to the John Deere Operations Center, which many farmers already use, a farmer is always connected to the vehicle. Before the farmer is notified of an issue, multiple independent human reviewers examine the images the tractor sends to the cloud. If all agree that the tractor returned a false positive, they restart the tractor, a process completed in approximately 23 seconds. If an object actually is in front of the tractor, the farmer can intervene via the app. The farmer receives a notification and then, according to Bansal, has a business decision to make. “They can choose to navigate the vehicle around the object—that would mean that a small part of the field would not be complete—or they could choose to come to the field and physically move the object.”
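Reduced to its essentials, that review-and-restart flow is a simple decision rule. The Python sketch below illustrates it; every name in it is invented for illustration, since the article does not describe John Deere’s actual software interfaces.

```python
# Hypothetical sketch of the stop-and-review flow described above.
# All names here are invented for illustration; John Deere's actual
# Operations Center integration is not public.

def handle_safety_bubble_event(reviewer_votes: list[str], farmer_choice: str) -> str:
    """Decide what the stopped tractor does next.

    reviewer_votes: what each independent human reviewer concluded from
                    the uploaded images ("false_positive" or "obstacle").
    farmer_choice:  the farmer's decision from the app, consulted only
                    when reviewers confirm a real obstacle.
    """
    # Reviewers must unanimously call it a false positive to restart the
    # tractor remotely (a round trip of roughly 23 seconds, per Bansal).
    if reviewer_votes and all(v == "false_positive" for v in reviewer_votes):
        return "resume"
    # Real obstacle: the farmer makes the call via the app.
    if farmer_choice == "navigate_around":
        return "detour"          # a small part of the field goes unworked
    return "wait_for_farmer"     # farmer comes to move the object physically

# Example: reviewers disagree, farmer chooses to route around the object.
print(handle_safety_bubble_event(["false_positive", "obstacle"], "navigate_around"))
# -> "detour"
```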
Bansal explains that the autonomous tractor was built on the foundation of existing John Deere systems. The company has a hardened GNSS system in place and has been providing tractors that feature auto steering for 20 years. Auto steering lets the farmer plan a path through the field while the tractor steers itself along it. “It’s done using really good GNSS through two Deere StarFire receivers on the tractors,” says Bansal. On top of these systems, John Deere built the autonomous tractor, which is based on computer vision and machine learning. “At a high level, these are the core technologies driving the autonomous tractor,” says Bansal.
Bansal says that the company has been testing and collecting data for more than three years. Deere employees traveled to nearly 20 states and collected more than 200 million image pairs. These images were used to train the machine learning model that stops the vehicle when an unknown object enters the safety bubble around the tractor.
A farmer first needs to transport the tractor to its desired location. Once there, the tractor relies on its six pairs of stereo cameras. Three pairs are mounted on the nose of the tractor, and the other three are mounted to the cab: one pair on the left, one pair on the right, and one pair pointed toward the rear of the tractor. The system processes images at 3 fps, meaning it processes a frame roughly every 330 ms. “We have done the camera architecture in such a way that we don’t have any blind spots,” says Bansal, adding that the safety bubble extends approximately 10 meters from the vehicle.
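Those figures leave a comfortable margin between the perception rate and the vehicle’s motion. The quick arithmetic below checks them; the 8 km/h working speed is an assumed typical tillage speed for illustration, not a figure from the article.

```python
# Sanity-checking the numbers Bansal cites: six stereo pairs covering
# 360 degrees, 3 fps processing, a ~10 m safety bubble. The 8 km/h
# working speed is an assumption, not from the article.
NUM_STEREO_PAIRS = 6
FRAME_RATE_HZ = 3
SAFETY_BUBBLE_M = 10
ASSUMED_SPEED_KMH = 8

fov_per_pair_deg = 360 / NUM_STEREO_PAIRS      # each pair must cover >= 60 degrees
frame_period_ms = 1000 / FRAME_RATE_HZ         # ~333 ms budget per frame
meters_per_frame = (ASSUMED_SPEED_KMH / 3.6) / FRAME_RATE_HZ

print(f"Minimum coverage per pair: {fov_per_pair_deg:.0f} degrees")
print(f"Processing budget per frame: {frame_period_ms:.0f} ms")
print(f"Distance traveled per frame: {meters_per_frame:.2f} m "
      f"(vs. a {SAFETY_BUBBLE_M} m safety bubble)")
```

At that assumed speed, the tractor covers well under a meter between frames, a small fraction of the 10-meter bubble.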
The 200 million images are stored in the cloud. Model training happens on GPUs in the cloud or on physical GPUs at Blue River’s Sunnyvale, CA, facility. “So, it’s a convolutional-neural-network-based design that has 10 million parameters.” According to Bansal, the neural network Blue River trained takes four input channels: RGBD. RGB is the color part of the image, and D is depth. “We need two things: we need to know what is in the image, whether it is a human or the ground, and how far away they are.” The depth comes from the stereo cameras; Blue River uses machine learning on the stereo imagery to determine how far away an object in an image is. “So, for our detection learning model, these four channels go as an input,” says Bansal.

The model divides the world into five categories: ground (anything that can be driven over); sky (which is infinitely far away); trees, which it will stop for but which are often on the horizon; large objects, including vehicles, animals, buildings, and rocks; and a fifth category Blue River calls an “implement.” An implement is something attached to the tractor, which it will not stop for—in this case the TruSet™-enabled chisel plow. The machine learning model takes the RGBD channels as input and predicts which of the five categories each pixel belongs to.
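In code, a per-pixel classifier of that shape looks roughly like the sketch below. The tiny encoder-decoder here is a minimal stand-in: only the input channels (RGBD), the rough parameter count, and the five output classes are public, and Blue River’s actual architecture is not.

```python
# Minimal PyTorch sketch of a per-pixel classifier like the one Bansal
# describes: 4 input channels (R, G, B, depth) and 5 output classes.
# The real Blue River network is not public; this design is illustrative.
import torch
import torch.nn as nn

CLASSES = ["ground", "sky", "tree", "large_object", "implement"]

class RGBDSegmenter(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        # Encoder: downsample spatially while growing the channel count.
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, stride=2, padding=1),  # RGBD in
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to input resolution, one logit per class.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, num_classes, kernel_size=2, stride=2),
        )

    def forward(self, rgbd: torch.Tensor) -> torch.Tensor:
        # rgbd: (batch, 4, H, W) -> logits: (batch, 5, H, W)
        return self.decoder(self.encoder(rgbd))

model = RGBDSegmenter()
frame = torch.rand(1, 4, 480, 640)      # one RGBD frame
pred = model(frame).argmax(dim=1)       # per-pixel class indices, (1, 480, 640)
```

A production network with roughly 10 million parameters would be far deeper, but the input/output contract (four channels in, one of five labels per pixel out) matches the one Bansal describes.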
Processing the images captured by the stereo cameras takes place on the tractor itself, on an NVIDIA Jetson Xavier GPU. “There is obviously a machine learning model,” says Bansal, “but there is also embedded software that is running on the hardware on the tractor. So as the images come in at 3 fps, they’re going through this model, and we’re getting an output.”
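A minimal version of that embedded loop might look like the sketch below, reusing the illustrative RGBDSegmenter from the previous sketch. The capture function passed in is a hypothetical stand-in for the tractor’s camera interface, not Deere’s embedded software.

```python
# Sketch of the on-tractor inference loop Bansal outlines: frames arrive
# at 3 fps and each passes through the model before the next is due.
# `capture_rgbd_frame` is a hypothetical camera-interface stand-in.
import time
import torch

def inference_loop(model, capture_rgbd_frame, frame_period_s: float = 1 / 3):
    model.eval()
    while True:
        start = time.monotonic()
        frame = capture_rgbd_frame()             # (1, 4, H, W) tensor
        with torch.no_grad():                    # inference only, no gradients
            labels = model(frame).argmax(dim=1)  # per-pixel class indices
        # Stop logic would inspect `labels` here for non-drivable pixels
        # inside the 10 m safety bubble and halt the vehicle if found.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, frame_period_s - elapsed))  # hold the 3 fps cadence
```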
The autonomous tractor can run 24 hours per day, which means it also runs in the dark. According to Bansal, it actually performs better in the dark: during the day, sun and shadows vary depending on where you are and which direction you are moving, while night conditions are always consistent. Operating at night, however, means lighting can be a challenge. According to Bansal, the tractor already comes with lights that provide good illumination; the same lights farmers use to operate the tractors during their 16- to 18-hour days are sufficient. But farmers don’t usually need to see what’s behind the implement, so Bansal says they added lights to the rear of the implement. All of the lights on the tractor, including the ones installed at the back of the implement for autonomous operation, are LEDs.
The pilot for the autonomous tractor started in the fall of 2019, primarily in the upper Midwest. The autonomous tractor is being used on farms today, with John Deere expecting to ship more than 10 of these tractors to farms in the Midwest for fall 2022. “We are shipping these to help our customers as they have to increase their productivity by 50% in the next 30 years,” says Bansal. “Labor is one of the biggest problems they have, and by giving them autonomous tractors, they’ll be able to run 24 hours per day. We’ll give them the opportunity to focus on more critical parts of their business.”
Chris Mc Loone | Editor in Chief
Former Editor in Chief Chris Mc Loone joined the Vision Systems Design team as editor in chief in 2021. Chris has been in B2B media for over 25 years. During his tenure at VSD, he covered machine vision and imaging from numerous angles, including application stories, technology trends, industry news, market updates, and new products.