New AI-driven tool improves root image segmentation


Developed by Berkeley Lab researchers, RhizoNet is a new computational tool that harnesses the power of AI to transform how plant roots are studied, providing new insights into their behavior under different environmental conditions. It works in conjunction with EcoFAB, a new hydroponic growing device that facilitates imaging of plants in situ by providing a detailed view of their root systems. Credit: Thor Swift, Lawrence Berkeley National Laboratory


Understanding the hidden half of living plants, their roots, is essential to building a sustainable world. Roots are more than just anchors: they are the dynamic interface between the plant and the soil, crucial for water absorption, nutrient uptake, and ultimately the plant's survival.

In their search to increase agricultural yields and develop crops that are more resilient to climate change, scientists in Lawrence Berkeley National Laboratory's (Berkeley Lab) Applied Mathematics and Computational Research (AMCR) and Environmental Genomics and Systems Biology (EGSB) divisions have made a major breakthrough. Their latest innovation, RhizoNet, harnesses the power of artificial intelligence (AI) to transform the way plant roots are studied, providing new insights into root behavior under different environmental conditions.

The pioneering tool, detailed in a study published June 5 in Scientific Reports, revolutionizes root image analysis by automating the process and achieving exceptional accuracy. Traditional methods are laborious, error-prone, and ill-suited to the complex, tangled nature of root systems.

RhizoNet introduces cutting-edge deep learning approaches that enable researchers to accurately track root growth and biomass. Built on a convolutional neural network backbone, the new computational tool semantically segments plant roots to assess comprehensive biomass and growth, transforming the way laboratories analyze plant roots and advancing efforts toward autonomous laboratories.
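
In practice, a semantic segmentation model of this kind outputs a binary mask separating root pixels from background, which can then be reduced to simple biomass and growth proxies. The sketch below, with hypothetical function and variable names, illustrates the general idea of that conversion; it is not the paper's actual pipeline.

```python
import numpy as np

def root_area_from_mask(mask: np.ndarray, mm_per_pixel: float) -> float:
    """Convert a binary segmentation mask (1 = root, 0 = background)
    into a projected root area in mm^2, a common proxy for biomass."""
    root_pixels = int(mask.sum())
    return root_pixels * mm_per_pixel ** 2

# Growth between two imaging time points is then a simple difference:
# growth = root_area_from_mask(mask_t2, s) - root_area_from_mask(mask_t1, s)
```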

“RhizoNet's ability to standardize root segmentation and phenotyping represents a major advancement in the systematic and rapid analysis of thousands of images. This innovation will aid in our ongoing efforts to increase the accuracy of capturing root growth dynamics under diverse plant conditions,” explained Berkeley Lab's Daniela Ushizima, principal investigator for the AI-driven software.

Getting to the root

Root analysis has traditionally relied on flatbed scanners and manual segmentation methods, which are not only time-consuming but also error-prone, especially in large-scale studies involving multiple plants. Additionally, root image segmentation poses significant challenges due to natural phenomena such as air bubbles, droplets, reflections, and shadows.

The automated analysis process is further complicated by the complex nature of root structure and the presence of a noisy background. These complexities are particularly acute at small spatial scales, where fine structures may only be pixel-wide, making manual annotation extremely challenging even for experienced human annotators.

EGSB recently announced the latest version (2.0) of EcoFAB, a hydroponic cultivation device that facilitates in situ imaging of plants by providing detailed images of their root systems. Developed through a collaboration between EGSB, the DOE Joint Genome Institute (JGI), and Berkeley Lab's Climate and Ecosystem Sciences Division, EcoFAB is part of an automated laboratory system designed to run artificial ecosystem experiments that increase data reproducibility.

RhizoNet processes color scans of plants grown in EcoFABs under specific nutritional treatments, addressing the scientific challenge of plant root analysis. It employs a sophisticated Residual U-Net architecture, a semantic segmentation network that improves on the original U-Net by adding residual connections between the input and output blocks at each resolution level of the encoder and decoder pathways, to achieve root segmentation specifically adapted to EcoFAB conditions and significantly improve prediction accuracy.
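
The paper's exact network is not reproduced here, but the core building block of a Residual U-Net can be sketched in a few lines. The PyTorch module below, a hypothetical `ResidualBlock`, shows what a residual connection within one encoder or decoder level looks like; the layer sizes and normalization choices are assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Conv block with a skip (residual) connection between its
    input and output, as used at each level of a Residual U-Net."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        # 1x1 projection so the identity path matches the output channels
        self.skip = (nn.Conv2d(in_ch, out_ch, kernel_size=1)
                     if in_ch != out_ch else nn.Identity())
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.body(x) + self.skip(x))
```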

The system also integrates a convexification procedure that encapsulates the roots identified across the time series and helps to quickly delineate key root components from a complex background. This integration is key to accurately monitoring root biomass and growth over time, especially in plants grown under different nutrient treatments in EcoFAB.
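
The article does not spell out the paper's convexification procedure, but the general idea, computing a convex region that encloses all roots observed across the time series and using it to suppress spurious detections outside that region, can be sketched with scikit-image. The function below is a hypothetical illustration under that assumption.

```python
import numpy as np
from skimage.morphology import convex_hull_image

def convexify(masks: list[np.ndarray]) -> np.ndarray:
    """Build a convex region enclosing all roots seen so far in the
    time series; use it to suppress detections outside that region."""
    union = np.logical_or.reduce(masks)  # roots seen at any time point
    return convex_hull_image(union)      # smallest convex region around them

# Restrict a new prediction to the plausible root region:
# cleaned = new_mask & convexify(previous_masks)
```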

To illustrate this, the new paper details how researchers used EcoFAB and RhizoNet to process scans of the roots of Brachypodium distachyon (a small grass) exposed to different nutrient-deficiency conditions over a period of about five weeks. These images, taken every three to seven days, provide important data to help scientists understand how roots adapt to different environments. The high-throughput nature of EcoBOT, a new image acquisition system for EcoFAB, allows the research team to systematically monitor experiments, provided the data can be analyzed quickly enough.

“EcoBOT has made a major contribution to reducing the manual work involved in plant growing experiments, and now RhizoNet is reducing the manual work involved in analyzing the data it generates,” said Peter Andeer, research scientist at EGSB and lead developer of EcoBOT, who collaborated with Ushizima on the research. “This increases throughput and brings us closer to our goal of a self-driving lab.”

Ushizima noted that resources from the National Energy Research Scientific Computing Center (NERSC), a U.S. Department of Energy (DOE) user facility at Berkeley Lab, were used to train RhizoNet and perform inference, bringing this computer vision capability to EcoBOT.

“Although EcoBOT can automatically collect images, it was not able to determine how plants respond to various environmental changes, whether they are alive or dead, growing or dying,” Ushizima explained. “By measuring roots with RhizoNet, we can obtain detailed data on root biomass and growth, not only to determine plant vitality, but also to provide comprehensive, quantitative insights that are not easily observable by traditional methods. After the model is trained, it can be reused across multiple experiments, with plants it has never seen.”

“To analyze the complex plant images from EcoBOT, we created a novel convolutional neural network for semantic segmentation,” added Zineb Sordo, a computer systems engineer at AMCR who works as a data scientist on the project.

“Our goal was to design an optimized pipeline that uses prior information about the time series to improve the accuracy of our model beyond manual annotation done on a single frame. RhizoNet processes noisy images and allows us to detect plant roots from the images and calculate biomass and growth.”

One patch at a time

While tuning the model, the researchers found that its performance improved significantly when they used smaller image patches: the receptive field of each neuron in the network's early layers is smaller, allowing the model to capture fine details more effectively and enrich the latent space with diverse feature vectors.

This approach not only improves the model's ability to generalize to unseen EcoFAB images, but also makes the model more robust, allowing it to focus on thin objects and capture complex patterns despite a range of visual artifacts.

Smaller patches also help counter class imbalance, because sparsely labeled patches (those with less than 20% annotated pixels, which are primarily background) can be filtered out. The team's results show higher accuracy, precision, recall, and intersection over union (IoU) with smaller patch sizes, indicating an improvement in the model's ability to distinguish roots from other objects and artifacts.
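
A minimal sketch of such patch filtering, assuming non-overlapping tiles and the 20% threshold mentioned above (the function and parameter names are hypothetical, not from the paper):

```python
import numpy as np

def extract_patches(image, labels, patch=64, min_annotated=0.20):
    """Tile an image into non-overlapping patches, keeping only those
    with at least 20% annotated pixels, to limit the background-heavy
    class imbalance described above."""
    h, w = labels.shape
    kept = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            lab = labels[y:y + patch, x:x + patch]
            if (lab > 0).mean() >= min_annotated:
                kept.append((image[y:y + patch, x:x + patch], lab))
    return kept
```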

To validate root prediction performance, the paper compares predicted root biomass with actual measurements. Linear regression analysis reveals significant correlations, indicating that the automated segmentation is more accurate than manual annotation, which often struggles to distinguish thin root pixels from similar-looking noise. This comparison highlights the challenges faced by human annotators and demonstrates the advanced capabilities of the RhizoNet model, especially when trained on small patch sizes.
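
This kind of validation amounts to an ordinary least-squares fit between predicted and measured values. The sketch below uses `scipy.stats.linregress` on purely illustrative numbers, not data from the paper:

```python
from scipy.stats import linregress

# Hypothetical paired measurements: predicted root area derived from
# segmentation masks vs. biomass measured at harvest (illustrative only).
predicted_area = [12.1, 18.4, 25.0, 31.7, 40.2]   # mm^2
measured_biomass = [0.8, 1.3, 1.7, 2.2, 2.9]      # mg

fit = linregress(predicted_area, measured_biomass)
print(f"r^2 = {fit.rvalue**2:.3f}, slope = {fit.slope:.3f}")
```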

The authors note that this study demonstrates the practical application of RhizoNet in modern research environments and lays the foundation for future innovations in sustainable energy solutions, as well as carbon sequestration technologies using plants and microbes. The research team is optimistic about the implications of their findings.

“Our next step is to refine RhizoNet's capabilities to further improve plant root detection and branching patterns,” Ushizima said, “and we also envision adapting and applying these deep learning algorithms to the study of roots grown in soil, as well as to new materials science investigations.”

“We are looking at iterative training protocols, hyperparameter optimization, and leveraging multiple GPUs. These computational tools are designed to help science teams analyze a range of experiments captured as images, and can be applied across multiple disciplines.”

For more information:
Zineb Sordo et al., RhizoNet segments plant roots to assess biomass and growth for enabling self-driving labs, Scientific Reports (2024). DOI: 10.1038/s41598-024-63497-8

Journal Information:
Scientific Reports


