Citation: Elsherbiny O, Gao J M, Guo Y N, Tunio M H, Mosha A H. Fusion of the deep networks for rapid detection of branch-infected aeroponically cultivated mulberries using multimodal traits. Int J Agric & Biol Eng, 2025; 18(2): 75–88. DOI: 10.25165/j.ijabe.20251802.8666

Fusion of the deep networks for rapid detection of branch-infected aeroponically cultivated mulberries using multimodal traits

Abstract: Automatic diagnosis of disease in aeroponically cultivated branches is crucial for improving root development and overall plant survival during propagation. Deep learning combined with visible imaging offers a route to precise health assessment, although feature selection and model design remain challenging and directly affect diagnostic accuracy and effectiveness. The primary objective of this study is to explore a hybrid deep network that integrates multimodal data, namely texture and color attributes together with image color modes, to accurately detect the presence of mildew on mulberry branches. The proposed framework combines a Convolutional Neural Network (CNN) with Gated Recurrent Units (GRU). Several color modes were evaluated, including grayscale, RGB (Red-Green-Blue), HSV (Hue-Saturation-Value), and CMYK (Cyan-Magenta-Yellow-Black). The RGB-based traits comprise nineteen vegetation color indices (VIs) and six texture variables derived from the gray-level co-occurrence matrix (GLCM). The results demonstrated that the CNNCMYK-GRUf network effectively integrates CMYK image data with color-texture features for monitoring mulberry branch health during aeroponic propagation. It achieved a validation accuracy (Ac) of 99.50%, with classification precision (Pr), recall (Re), and F-measure (Fm) at the same level, along with an intersection over union (IoU) of 98.90% and a loss value of 0.034. This network outperformed the model that relied solely on individual image attributes and surpassed other deep networks such as Vision Transformers (Ac=94.80%), Swin Transformers (Ac=89.80%), and Multi-Layer Perceptrons (Ac=88.30%). The proposed methodology can therefore precisely assess the health of mulberry shoots, enabling the rapid deployment of intelligent aeroponic systems. Furthermore, adapting the developed model for mobile platforms could improve its accessibility and promote sustainable, efficient agricultural practices.
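The abstract describes a two-branch fusion architecture: a CNN operating on CMYK images and a GRU operating on hand-crafted color-texture traits, with the two streams merged for classification. The sketch below is a minimal illustration of how such a fusion model might be assembled in Keras, with scikit-image supplying the six GLCM texture variables; the layer sizes, the example vegetation indices, and the treatment of the trait vector as a short sequence for the GRU are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a CNN + GRU fusion classifier
# for healthy vs. mildew-infected mulberry branches. All hyperparameters
# and the example vegetation indices are illustrative assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from tensorflow.keras import layers, models

# The six GLCM texture variables mentioned in the abstract.
GLCM_PROPS = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation", "ASM"]

def texture_features(gray_u8):
    """Six GLCM texture variables from an 8-bit grayscale image."""
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return np.array([graycoprops(glcm, p)[0, 0] for p in GLCM_PROPS])

def color_indices(rgb):
    """A few example vegetation color indices (the paper uses nineteen)."""
    r, g, b = (rgb[..., i].astype(float).mean() for i in range(3))
    total = r + g + b + 1e-9
    rn, gn, bn = r / total, g / total, b / total
    exg = 2 * gn - rn - bn          # excess green
    exr = 1.4 * rn - gn             # excess red
    return np.array([exg, exr, exg - exr])

def build_fusion_model(img_shape=(128, 128, 4), n_feats=25, n_classes=2):
    """CNN branch on CMYK images fused with a GRU branch on color-texture traits."""
    img_in = layers.Input(shape=img_shape, name="cmyk_image")
    x = layers.Conv2D(32, 3, activation="relu")(img_in)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)

    # Hand-crafted traits fed to the GRU as a length-n_feats sequence of scalars.
    feat_in = layers.Input(shape=(n_feats, 1), name="color_texture_traits")
    y = layers.GRU(64)(feat_in)

    z = layers.concatenate([x, y])
    z = layers.Dense(128, activation="relu")(z)
    out = layers.Dense(n_classes, activation="softmax")(z)

    model = models.Model([img_in, feat_in], out)
    model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    return model
```

In this sketch the two input branches are trained jointly, so the reported gain of the fused CNNCMYK-GRUf network over single-attribute models corresponds to the concatenation of the image embedding and the trait embedding before the final dense layers.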
