Using Multispectral Imaging and Artificial Intelligence to Detect Crop Diseases and Pests Early
Abstract
In sustainable agriculture, detecting pests and diseases early is critical. Recent advances in deep learning (DL) and multimodal imaging, such as multispectral and thermal data, have made crop health monitoring increasingly promising. Despite this progress, achieving high accuracy across diverse crops with real-time performance remains a challenge. This work introduces a hybrid convolutional neural network (CNN)-attention model that integrates multispectral and thermal data for pest and disease detection. A total of 5,000 samples were collected from five crops (maize, rice, wheat, tomato, and cassava) across different growth stages, labelled across healthy, fungal, bacterial, viral, and pest-infestation classes. The data was divided into 70% training, 15% validation, and 15% test sets: 3,500 samples were used for training, and 750 samples each for validation and testing. The hybrid CNN-attention model was compared against baseline models (SVM, Random Forest, CNN-RGB, CNN-Multispectral) and fusion strategies (early, late, and hybrid fusion) using accuracy, precision, recall, F1-score, and early detection sensitivity. The hybrid model achieved the highest accuracy, 91.0%, for rice at the vegetative stage, outperforming all baseline and fusion models, with consistently high F1-scores. Early detection sensitivity reached 88.1% for rice and 87.3% for maize. Per-class performance was strong across the board, at 92.0% for healthy plants and 88.2% for pest infestation. Future work can expand the dataset with additional crops, diseases, and environmental conditions, and optimize detection time and early-detection sensitivity for real-time deployment in agricultural decision support systems.
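The attention mechanism in such a hybrid fusion model typically weights each modality's feature vector before combining them. The paper does not publish its implementation, so the sketch below is a minimal pure-Python illustration of softmax-attention fusion over two modality feature vectors; the function names, attention scores, and toy feature values are ours, not from the paper (in the real model the scores would be learned from the CNN branches).

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_fuse(features, scores):
    """Fuse per-modality feature vectors by a softmax-weighted sum.

    features: list of equal-length feature vectors, one per modality.
    scores:   one raw attention score per modality (learned in a real model).
    """
    weights = softmax(scores)
    dim = len(features[0])
    return [
        sum(weights[i] * features[i][j] for i in range(len(features)))
        for j in range(dim)
    ]

# Toy feature vectors for the two modalities used in the paper
# (multispectral and thermal); the numbers are illustrative only.
multispectral = [0.8, 0.1, 0.3]
thermal = [0.2, 0.9, 0.4]
fused = attention_fuse([multispectral, thermal], scores=[1.2, 0.4])
```

In a full model, `fused` would feed the downstream classifier head; early fusion would instead concatenate raw inputs before the CNN, and late fusion would average per-modality predictions.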