Environment and Agriculture

John Deere Unveils New Autonomous Machines

The new fleet incorporates updated machine vision technology with a longer perception range than the first-generation technology.
Jan. 15, 2025

John Deere recently unveiled additions to its autonomous fleet of machines and detailed the technology—AI and cameras, for example—behind a second-generation autonomy stack.

The machines automate tasks in agriculture, construction (specifically quarries) and commercial landscaping. All are in the testing phase with a limited number of customers. John Deere expects to release a second-generation autonomy kit to upgrade certain existing machines later in 2025. Company officials did not provide dates for large-scale commercial availability of the new machines.

The company launched its first autonomous machine, a tractor for tillage, in 2022. John Deere says its goal is to have a fully autonomous farming system for corn and soybean production in the United States by 2030.

Related: John Deere Introduces Autonomous Tractor

“When we talk about autonomy, we mean full autonomy. No one is in the machine,” Jahmy Hindman, chief technology officer at Deere & Company, which markets itself as John Deere, explained at a press conference at CES 2025.

“Our autonomy solutions are doing real work for real customers, and our customers demand that the safety and quality of the work that the machines do autonomously is at least as good as if not better than the job performed when someone is sitting in the operator’s seat,” Hindman added.

The Technology Stack

The machines run John Deere’s second-generation technology stack, which increases the perception range from 16 m to 24 m compared with the first-generation autonomous tractor. The upgrade also allows the machines to drive 40% faster and pull implements that are twice as wide.

Related: John Deere Uses Machine Vision and Machine Learning to Automate Agriculture

Operators manage the machines via John Deere Operations Center Mobile, the company’s cloud-based platform. Through an app on a mobile phone, operators have access to live video, images, data and metrics, so they can track a machine’s progress. They can also adjust operating parameters such as speed.

Before a machine begins running autonomously, an operator moves it to a starting location and specifies the parameters of the job.

The key difference between the autonomy stacks is the approach to imaging. The first generation utilizes pairs of cameras, which are positioned with a slight angular offset, but this approach has disadvantages, Deere officials say.

“To increase the sensing range of a system like this, we have to widen that baseline. Widening that baseline makes it harder and harder to preserve those angular orientations between the cameras,” Matt Potter, director of engineering for robotics and mobility technology at John Deere, explained at the same press conference. 
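Potter’s tradeoff follows from stereo geometry: depth is recovered as Z = f·B/d, and for a fixed disparity error the depth error grows with the square of range but shrinks as the baseline B widens. The sketch below illustrates that relationship; the focal length, baseline, and disparity error are illustrative assumptions, not Deere’s specifications.

```python
# Stereo depth: Z = f * B / d, with focal length f in pixels,
# baseline B in meters, and disparity d in pixels.
f_px = 2000.0      # assumed focal length (pixels)
baseline_m = 0.30  # assumed camera separation (meters)

def depth_m(disparity_px: float) -> float:
    """Depth in meters for a given pixel disparity."""
    return f_px * baseline_m / disparity_px

def depth_error_m(z_m: float, disparity_err_px: float = 0.25) -> float:
    """Approximate depth error at range z_m for a fixed disparity error.

    From Z = f*B/d it follows that dZ ~= (Z**2 / (f * B)) * dd, so error
    grows quadratically with range and falls as the baseline widens.
    """
    return (z_m ** 2) / (f_px * baseline_m) * disparity_err_px

for z in (16.0, 24.0):
    print(f"range {z:.0f} m -> depth error ~{depth_error_m(z):.2f} m")
```

Doubling the baseline halves the depth error at any given range, which is why pushing the perception range from 16 m to 24 m calls for a wider baseline, and why holding widely spaced cameras in precise angular alignment becomes the hard part.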

Related: Machine Vision System Tracks Vineyard's Crop Yields

Second-generation technology uses a different setup to solve this problem. “We created what we are calling camera arrays. Instead of two cameras overlapping, we have many cameras overlapping. Each camera can correct the position orientation of every camera, every single frame,” explained Willy Pell, CEO of Blue River Technology (Sunnyvale, CA, USA), a wholly owned subsidiary of Deere & Company focused on creating intelligent machinery. “This real time calibration allows us to run with wider baselines and have more accurate depth at greater range,” he added. 
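A rough way to picture the cross-correction Pell describes: every overlapping camera pair yields a relative-orientation measurement, and a least-squares solve over all pairs makes the whole array mutually consistent each frame. The sketch below is our illustration of that idea, not Deere’s implementation; it reduces each camera’s pose to a single angle for brevity, whereas a real system would estimate full 6-DoF poses.

```python
import numpy as np

n_cams = 6
rng = np.random.default_rng(0)
# True orientations (radians) that calibration should recover;
# camera 0 serves as the reference.
true_theta = rng.uniform(-0.05, 0.05, n_cams)

# Each overlapping pair (i, j) contributes a noisy measurement of
# theta_j - theta_i, e.g., from feature matches in the shared view.
pairs = [(i, j) for i in range(n_cams) for j in range(i + 1, n_cams)]
meas = np.array([true_theta[j] - true_theta[i] + rng.normal(0, 1e-3)
                 for i, j in pairs])

# Linear system A @ theta = meas; anchor camera 0 at angle 0 by
# dropping its column, then solve in the least-squares sense.
A = np.zeros((len(pairs), n_cams))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1.0, 1.0
est = np.linalg.lstsq(A[:, 1:], meas, rcond=None)[0]

print("estimated:", np.round(est, 4))
print("true:     ", np.round(true_theta[1:] - true_theta[0], 4))
```

Because every camera is constrained by many neighbors rather than a single partner, a drifting camera gets pulled back into agreement with the rest of the array, frame after frame.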

Perception data is fed into embedded vision processing units based on Nvidia (Santa Clara, CA, USA) Orin GPUs. The air-cooled units are custom designed by John Deere to operate under harsh conditions such as extreme vibration, shock, and temperatures.

Related: Aerial Imaging Aids Precision Agriculture

The vision processing units can ingest up to 12 camera feeds, each with 8 MPixel resolution. Using neural networks, they process and classify images in milliseconds.
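For a sense of the data volume that implies, a back-of-envelope calculation follows; the frame rate is our assumption, since the article specifies only the feed count and resolution.

```python
feeds = 12       # camera feeds (from the article)
megapixels = 8   # resolution per feed (from the article)
fps = 30         # assumed frame rate; not stated in the article

throughput_mpix_s = feeds * megapixels * fps
print(f"~{throughput_mpix_s} MPixels/s")  # ~2880 MPixels/s, i.e., ~2.9 GPix/s
```

Even at modest frame rates, the units must sustain on the order of billions of pixels per second, which helps explain why inference runs on embedded GPUs aboard the machines themselves.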

Details of the Autonomous Systems

The machines are outfitted with LED lights, allowing them to operate day or night.

The specific machines unveiled at the CES press conference are:

  • Autonomous 9RX Tractor for Large-Scale Agriculture: features 16 individual cameras arranged in pods to enable a 360° view of the field.
  • Autonomous 5ML Orchard Tractor for Air Blast Spraying: adds LiDAR sensors to help it navigate in orchards, where dense tree canopies sometimes obstruct satellite-based guidance. The initial machine will have a diesel engine; a battery-powered electric version will be released later. The tractor is designed to spray high-value crops like fruit and nuts, which require repetitive spraying 6-8 times per growing season while driving at 2.5 mph.
  • 460 P-Tier Autonomous Articulated Dump Truck (ADT) for Quarry Operations: equipped with 12 cameras, the truck can transport more than 92,000 lbs. of material from a load zone to a dump zone. Using the John Deere Operations Center app, quarry operators can specify load zones, dump zones, and haul routes.
  • Autonomous Battery Electric Mower for Commercial Landscaping: uses four pairs of stereo cameras on the front, left, right, and rear. It runs the same technology as other Deere autonomous machines at a reduced scale, since the machine has a smaller footprint. The mower is powered by sealed lithium-ion batteries and can run for up to 10 hours per charge cycle.


Training the Algorithms

The machine learning algorithms must be trained to recognize every type of object the machines might encounter. “When we first designed the system, we didn’t have insects in mind as a class of object to train on, but when you ran the machine at night with the lights on, a moth weighing less than a single ounce would bring our 40,000-pound machine to a halt,” Pell explained.

Related: Technology is Key to the Future of Farming

He added, “Autonomy is all about solving the long tail of edge cases. Bugs being one example. We are building a tech stack that is modular, and we are spreading it laterally across our machines.”


About the Author

Linda Wilson

Editor in Chief

Linda Wilson joined the team at Vision Systems Design in 2022. She has more than 25 years of experience in B2B publishing and has written for numerous publications, including Modern Healthcare, InformationWeek, Computerworld, Health Data Management, and many others. Before joining VSD, she was the senior editor at Medical Laboratory Observer, a sister publication to VSD.         
