Wheeled robot measures leaf angles to help breed better corn plants

This image shows the autonomous robot, with multiple tiers of PhenoStereo cameras.
Credit: Lirong Xiang, NC State University

Researchers from North Carolina State University and Iowa State University have demonstrated an automated technology capable of accurately measuring the angle of leaves on corn plants in the field. This technology makes data collection on leaf angles significantly more efficient than conventional techniques, providing plant breeders with useful data more quickly.

“The angle of a plant’s leaves, relative to its stem, is important because the leaf angle affects how efficient the plant is at performing photosynthesis,” says Lirong Xiang, first author of a paper on the work and an assistant professor of biological and agricultural engineering at NC State. “For example, in corn, you want leaves at the top that are relatively vertical, but leaves further down the stalk that are more horizontal. This allows the plant to harvest more sunlight. Researchers who focus on plant breeding monitor this sort of plant architecture, because it informs their work.

“However, conventional methods for measuring leaf angles involve measuring leaves by hand with a protractor – which is both time-consuming and labor-intensive,” Xiang says. “We wanted to find a way to automate this process – and we did.”

The new technology – called AngleNet – has two key components: the hardware and the software.

The hardware, in this case, is a robotic device mounted on wheels. The device is steered manually, and is narrow enough to navigate between crop rows spaced 30 inches apart – the standard width used by farmers. The device itself consists of four tiers of cameras, each set to a different height to capture a different level of leaves on the surrounding plants. Each tier includes two cameras, allowing it to capture a stereoscopic view of the leaves and enabling 3D modeling of the plants.
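The paper does not spell out the camera parameters, but the core idea behind a stereoscopic pair is standard: a point on a leaf appears shifted (the "disparity") between the left and right images, and that shift determines its distance. A minimal sketch, with illustrative values for focal length and camera spacing that are not from the paper:

```python
# Sketch of stereo depth recovery for a calibrated camera pair.
# focal_px and baseline_m are hypothetical values, not AngleNet's specs.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Distance (m) to a point, from its pixel shift between the two cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity_px

# A leaf point shifted 40 px between cameras, with a 1200 px focal length
# and cameras 10 cm apart, lies about 3 m from the rig:
print(depth_from_disparity(40, 1200, 0.10))  # 3.0
```

Repeating this for many matched points across the two images yields a 3D point cloud of the plant, from which leaf and stem geometry can be reconstructed.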

As the device is steered down a row of plants, it is programmed to capture multiple stereoscopic images, at multiple heights, of every plant that it passes.

All of this visual data is fed into a software program that then computes the leaf angle for the leaves of each plant at different heights.
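The final geometric step reduces to measuring the angle between two 3D directions: one along the stem, one along the leaf. A simplified sketch of that step, with made-up vectors (the actual pipeline extracts these directions from the stereo reconstruction using deep convolutional neural networks):

```python
import math

def leaf_angle_deg(stem_vec, leaf_vec):
    """Angle in degrees between a leaf direction vector and the stem vector."""
    dot = sum(s * l for s, l in zip(stem_vec, leaf_vec))
    norms = math.dist((0, 0, 0), stem_vec) * math.dist((0, 0, 0), leaf_vec)
    return math.degrees(math.acos(dot / norms))

# A vertical stem and a leaf rising at 45 degrees in the x-z plane:
print(leaf_angle_deg((0, 0, 1), (1, 0, 1)))  # ~45.0
```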

“For plant breeders, it’s important to know not only what the leaf angle is, but how far those leaves are above the ground,” Xiang says. “This gives them the information they need to assess the leaf angle distribution for each row of plants. This, in turn, can help them identify genetic lines that have desirable traits – or undesirable traits.”

To test the accuracy of AngleNet, the researchers compared leaf angle measurements done by the robot in a corn field to leaf angle measurements made by hand using conventional techniques.

“We found that the angles measured by AngleNet were within 5 degrees of the angles measured by hand, which is well within the accepted margin of error for purposes of plant breeding,” Xiang says.
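This kind of validation amounts to comparing paired measurements and checking that their disagreement stays within tolerance. A toy illustration with invented numbers, not the paper's data:

```python
# Hypothetical paired measurements (degrees); values are invented.
robot  = [32.1, 48.7, 61.0, 55.4]   # AngleNet
manual = [30.5, 51.2, 58.8, 57.0]   # protractor

errors = [abs(r - m) for r, m in zip(robot, manual)]
print(max(errors))                 # worst-case disagreement
print(sum(errors) / len(errors))   # mean absolute error
assert all(e < 5.0 for e in errors)  # every reading within the 5-degree margin
```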

“We’re already working with some crop scientists to make use of this technology, and we’re optimistic that more researchers will be interested in adopting the technology to inform their work. Ultimately, our goal is to help expedite plant breeding research that will improve crop yield.”

The paper, “Field-based robotic leaf angle detection and characterization of maize plants using stereo vision and deep convolutional neural networks,” is published open access in the Journal of Field Robotics. Corresponding author of the paper is Lie Tang, a professor of agricultural and biosystems engineering at Iowa State. The paper was co-authored by Jingyao Gai, of Iowa State and Guangxi University; Yin Bao, of Iowa State and Auburn University; and Jianming Yu and Patrick Schnable, of Iowa State. The work was done with support from the National Science Foundation, under grant number 1625364; and from the Plant Sciences Institute at Iowa State.

Journal: Journal of Field Robotics
DOI: 10.1002/rob.22166
Method of Research: Experimental study
Subject of Research: Not applicable
Article Title: Field-based robotic leaf angle detection and characterization of maize plants using stereo vision and deep convolutional neural networks
Article Publication Date: 27-Feb-2023
COI Statement: none

Media Contact

Matt Shipman
North Carolina State University
matt_shipman@ncsu.edu
Office: 919-515-6386

Expert Contact

Lirong Xiang
NC State University
lxiang3@ncsu.edu
