1AImotion Bavaria, Technische Hochschule Ingolstadt, Ingolstadt, Germany
2Hochschule Weihenstephan-Triesdorf, Weidenbach, Germany
3Saatzucht Josef Breun GmbH and Co. KG, Herzogenaurach, Germany
4Technical University of Munich, Munich, Germany
Received 25 Jan 2023 | Accepted 19 Jun 2023 | Published 14 Jul 2023
Fusarium head blight (FHB) is one of the most prevalent wheat diseases, causing substantial yield losses and health risks. Efficient phenotyping of FHB is crucial for accelerating resistance breeding, but currently used methods are time-consuming and expensive. This article proposes a noninvasive classification model for FHB severity estimation from red–green–blue (RGB) images, without requiring extensive preprocessing. The model accepts images taken with consumer-grade, low-cost RGB cameras and classifies FHB severity into 6 ordinal levels. In addition, we introduce a novel dataset of around 3,000 images from 3 different years (2020, 2021, and 2022), with 2 FHB severity assessments per image from independent raters. We used a pretrained EfficientNet (size b0), redesigned as a regression model. The results demonstrate that the interrater reliability (Cohen’s kappa, κ) is substantially lower than the individual network-to-rater agreement, e.g., 0.68 between raters versus 0.76 between the network and a rater for the data captured in 2020. The model shows a generalization effect when trained with data from multiple years and tested on data from an independent year: using the images from 2020 and 2021 for training and 2022 for testing, we improved the score by 0.14, the accuracy by 0.11, and κ by 0.12, and reduced the root mean squared error by 0.5 compared to the best network trained on a single year’s data. The proposed lightweight model and methods could be deployed on mobile devices to automatically and objectively assess FHB severity from images taken with low-cost RGB cameras. The source code and the dataset are available at https://github.com/cvims/FHB_classification.
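To make the modeling approach concrete, the sketch below shows one way to turn a pretrained EfficientNet-b0 into a single-output regression model whose predictions are mapped back to the 6 ordinal severity levels, and how agreement between two label sequences (rater vs. rater, or network vs. rater) can be quantified with Cohen's kappa. This is a minimal illustration assuming a PyTorch/torchvision and scikit-learn setup; the function names and the rounding/clamping scheme are illustrative assumptions, not the authors' released code (see the linked repository for the actual implementation).

```python
# Illustrative sketch (assumption: PyTorch/torchvision), not the authors' code.
import torch
import torch.nn as nn
from torchvision import models
from sklearn.metrics import cohen_kappa_score

NUM_LEVELS = 6  # ordinal FHB severity levels 0..5


def build_regression_model() -> nn.Module:
    # Start from an ImageNet-pretrained EfficientNet-b0 ...
    weights = models.EfficientNet_B0_Weights.IMAGENET1K_V1
    model = models.efficientnet_b0(weights=weights)
    # ... and replace the 1000-class classifier with a single regression output.
    in_features = model.classifier[1].in_features
    model.classifier[1] = nn.Linear(in_features, 1)
    return model


def predict_levels(model: nn.Module, images: torch.Tensor) -> torch.Tensor:
    # Round and clamp the continuous regression output to a valid ordinal level.
    model.eval()
    with torch.no_grad():
        raw = model(images).squeeze(1)
    return raw.round().clamp(0, NUM_LEVELS - 1).long()


def agreement(labels_a, labels_b) -> float:
    # Cohen's kappa, as used in the abstract to compare interrater reliability
    # with network-to-rater agreement.
    return cohen_kappa_score(labels_a, labels_b)
```

Treating the ordinal severity scale as a regression target and discretizing at inference time is one common way to respect the ordering of the classes; whether the authors use exactly this scheme or an alternative ordinal formulation is best checked against the repository.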