Research Article | Open Access
Volume 2024 | Article ID 0132 | https://doi.org/10.34133/plantphenomics.0132

Automatic Root Length Estimation from Images Acquired In Situ without Segmentation

Faina Khoroshevsky,1,9 Kaining Zhou,2,8,9 Sharon Chemweno,3,8 Yael Edan,1 Aharon Bar-Hillel,1 Ofer Hadar,4 Boris Rewald,5,6 Pavel Baykalov,5,7 Jhonathan E. Ephrath,8 and Naftali Lazarovitch8

1Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel
2The Jacob Blaustein Center for Scientific Cooperation, The Jacob Blaustein Institutes for Desert Research, Ben-Gurion University of the Negev, Sede Boqer, Israel
3The Albert Katz International School for Desert Studies, The Jacob Blaustein Institutes for Desert Research, Ben-Gurion University of the Negev, Sede Boqer, Israel
4Department of Communication Systems Engineering, School of Electrical and Computer Engineering, Ben-Gurion University of the Negev, Beer Sheva, Israel
5Institute of Forest Ecology, Department of Forest and Soil Sciences, University of Natural Resources and Life Sciences, Vienna (BOKU), Vienna, Austria
6Faculty of Forestry and Wood Technology, Mendel University in Brno, Brno, Czech Republic
7Vienna Scientific Instruments GmbH, Alland, Austria
8French Associates Institute for Agriculture and Biotechnology of Drylands, The Jacob Blaustein Institutes for Desert Research, Ben-Gurion University of the Negev, Sede Boqer, Israel
9These authors contributed equally to this work

Received: 12 Jul 2023
Accepted: 12 Dec 2023
Published: 12 Jan 2024

Abstract

Image-based root phenotyping technologies, including the minirhizotron (MR), have expanded our understanding of in situ root responses to changing environmental conditions. The conventional manual methods used to analyze MR images are time-consuming, limiting their implementation. This study presents an adaptation of our previously developed convolutional neural network-based models to estimate the total (cumulative) root length (TRL) per MR image without requiring segmentation. Training data were derived from manual annotations in Rootfly, a software commonly used for MR image analysis. We compared TRL estimation with 2 models: a regression-based model and a detection-based model that detects the annotated points along the roots. Notably, the detection-based model can assist in examining human annotations by providing a visual inspection of roots in MR images. The models were trained and tested with 4,015 images acquired using 2 MR system types (manual and automated) and from 4 crop species (corn, pepper, melon, and tomato) grown under various abiotic stresses. These datasets are made publicly available as part of this publication. The coefficients of determination (R2) between the measurements made using Rootfly and the suggested TRL estimation models were 0.929 to 0.986 for the main datasets, demonstrating that this tool is accurate and robust. Additional analyses were conducted to examine the effects of (a) the data acquisition system, and thus image quality, on the models’ performance; (b) automated differentiation between images with and without roots; and (c) the use of the transfer learning technique. These approaches can support precision agriculture by providing real-time root growth information.
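For reference, the goodness-of-fit statistic reported above is the standard coefficient of determination; a minimal formulation (our notation, not taken from the article), with y_i the Rootfly-measured TRL of image i, \hat{y}_i the corresponding model estimate, and \bar{y} the mean measured TRL over n images, is

R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}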

