Professor Arno Ruckelshausen from the University of Applied Sciences (Osnabrück, Germany) has developed a vision-based robotic system that could be used for plant phenotyping -- the assessment of plant traits such as leaf area, stem diameter, and plant height and width.
To do so, Professor Ruckelshausen has equipped a four-wheeled BoniRob robot platform, developed by Amazonen-Werke H. Dreyer, with several sensors, including spectral imaging systems, light curtains, 3-D time-of-flight cameras and laser distance sensors, to measure the morphological and spectral parameters of plants such as maize growing in rows.
Having created the robotic system, Professor Ruckelshausen and his team have now built a software simulator to enable them to optimize the performance of the imaging sensors that are used on the robot.
The simulator was developed in Microsoft Robotics Studio, a software package typically used for 3-D robot navigation. Using the package, Professor Ruckelshausen and his team created software models of all of the imaging sensors on the vehicle as well as of its navigation system.
Just as the real robot acquires and stores data, the simulator stores the data acquired from its virtual sensors in a MySQL database, where it can then be analyzed further with sensor fusion algorithms to determine plant parameters such as height, number of leaves or biomass.
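The sketch below illustrates what such a data path might look like in Python: virtual sensor readings are written to a MySQL table and then combined into a per-plant estimate. The database schema, table and column names, and the simple averaging "fusion" step are assumptions for illustration only, not details taken from the BoniRob project.

```python
# Illustrative sketch of the simulator's data path: virtual sensor readings are
# stored in MySQL and later fused into per-plant parameter estimates.
# All table/column names and values are hypothetical.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="sim", password="sim",
                               database="phenotyping")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS sensor_readings (
        plant_id   INT,
        sensor     VARCHAR(32),   -- e.g. 'light_curtain', 'tof_camera'
        height_mm  FLOAT,         -- height estimate from this sensor
        leaf_count INT            -- leaf count estimate (NULL if not measured)
    )
""")

# Readings produced by the virtual sensors during one simulated pass over a plant row
readings = [
    (1, "light_curtain",  412.0, None),
    (1, "tof_camera",     405.5, 6),
    (1, "laser_distance", 409.8, None),
]
cur.executemany(
    "INSERT INTO sensor_readings (plant_id, sensor, height_mm, leaf_count) "
    "VALUES (%s, %s, %s, %s)", readings)
conn.commit()

# A trivial stand-in for sensor fusion: average the per-sensor height estimates.
cur.execute("SELECT plant_id, AVG(height_mm) FROM sensor_readings GROUP BY plant_id")
for plant_id, fused_height in cur.fetchall():
    print(f"plant {plant_id}: fused height estimate {fused_height:.1f} mm")

cur.close()
conn.close()
```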
To enable the researchers to calibrate the simulator, a CAD model of a real maize plant was generated. The virtual maize plant, which was also implemented in the Robotics Studio software, was then used to create an artificial maize plant with a rapid prototyping machine.
The nearly identical virtual and artificial maize plants now enable the researchers to compare the results from the simulator with those taken from the robot as it images the physical model. In doing so, they hope to conduct virtual experiments that will allow them to optimize the arrangement of the sensors on the robot and evaluate their performance before trialing any modifications in the field.
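A minimal sketch of that comparison step might look like the following, with simulated and robot-measured parameter estimates for the artificial plant placed side by side; the parameter names and values here are placeholders, not project data.

```python
# Hypothetical comparison of simulator output against robot measurements of the
# 3-D printed maize plant. All numbers are invented for illustration.
simulated = {"height_mm": 410.2, "stem_diameter_mm": 21.5, "leaf_count": 6}
measured  = {"height_mm": 402.7, "stem_diameter_mm": 22.1, "leaf_count": 6}

for param in simulated:
    delta = simulated[param] - measured[param]
    print(f"{param:>18}: sim {simulated[param]:7.1f}  robot {measured[param]:7.1f}  "
          f"delta {delta:+.1f}")
```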
Eventually, the researchers believe that the robot platform will be deployed to replace the current time-consuming and costly manual method of phenotyping.
Related articles from Vision Systems Design that you might also be interested in.
1. Vision-guided robot enters risky areas
Engineers at Toshiba Corporation (Tokyo, Japan) have developed a tetrapod robot that is able to carry out investigative and recovery work in locations that are too risky for people to enter, such as Tokyo Electric Power Company's Fukushima No. 1 nuclear power plant.
2. MIT robot navigates using Microsoft's Kinect
Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL; Cambridge, MA, USA) have developed a robot that uses Microsoft's Kinect to navigate through its surroundings.
3. Vision guides robot to clean air ducts
A vision-guided belt robot called Jetty, developed by Neovision (Prague, Czech Republic), can clean and inspect air-conditioning ducting, kitchen or industrial air vents, and spaces where cleaning is an unpleasant or difficult task.
-- Dave Wilson, Senior Editor, Vision Systems Design