Plant Disease Global-Local Features Fusion Attention Model

abelchai/PlantAIM


PlantAIM: A New Baseline Model Integrating Global Attention and Local Features for Enhanced Plant Disease Identification

(Figure: proposed PlantAIM architecture.)

The contributions of this paper are:

  1. We introduce the novel Plant Disease Global-Local Features Fusion Attention Model (PlantAIM), which combines Vision Transformer (ViT) and CNN components to enhance feature extraction for multi-crop plant disease identification.
  2. Our experimental results demonstrate PlantAIM's exceptional robustness and generalization, achieving state-of-the-art performance in both controlled environments and real-world scenarios.
  3. Our feature visualization analysis reveals that CNNs emphasize plant patterns, while ViTs focus on disease symptoms. By leveraging these complementary characteristics, PlantAIM sets a new benchmark in multi-crop plant disease identification.
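The global-local fusion idea above can be sketched in a few lines of PyTorch. This is an illustrative sketch only, not the authors' exact architecture: the dimensions, the single cross-attention layer, and the `GlobalLocalFusion` name are assumptions. It shows ViT-style global tokens attending over flattened CNN feature-map locations before classification.

```python
import torch
import torch.nn as nn

class GlobalLocalFusion(nn.Module):
    """Illustrative sketch (not the repo's code): fuse global ViT-style
    tokens with local CNN feature maps via cross-attention."""
    def __init__(self, vit_dim=768, cnn_dim=512, fused_dim=256, num_classes=38):
        super().__init__()
        self.vit_proj = nn.Linear(vit_dim, fused_dim)   # global branch projection
        self.cnn_proj = nn.Linear(cnn_dim, fused_dim)   # local branch projection
        self.attn = nn.MultiheadAttention(fused_dim, num_heads=4, batch_first=True)
        self.head = nn.Linear(fused_dim, num_classes)

    def forward(self, vit_tokens, cnn_fmap):
        # vit_tokens: (B, N, vit_dim); cnn_fmap: (B, cnn_dim, H, W)
        g = self.vit_proj(vit_tokens)                            # (B, N, D)
        l = self.cnn_proj(cnn_fmap.flatten(2).transpose(1, 2))   # (B, H*W, D)
        # global tokens (queries) attend to local CNN features (keys/values)
        fused, _ = self.attn(query=g, key=l, value=l)            # (B, N, D)
        return self.head(fused.mean(dim=1))                      # (B, num_classes)

model = GlobalLocalFusion()
# toy inputs: ViT-Base-like tokens and a ResNet-like 14x14 feature map
logits = model(torch.randn(2, 197, 768), torch.randn(2, 512, 14, 14))
print(logits.shape)  # torch.Size([2, 38])
```

In practice the two encoders would be pretrained backbones (e.g. from timm, which is listed in the dependencies) feeding their intermediate features into a fusion head like this.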

Accuracy Results

(Figure: accuracy comparison results.)

Grad-CAM Visualization Results

(Figure: Grad-CAM visualizations on tomato, cherry, and apple leaves.)
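Grad-CAM heatmaps like those above can be produced with a few lines of PyTorch autograd. Below is a minimal sketch over a toy CNN, not the script used for the repo's figures: it weights each feature-map channel by its average gradient with respect to the top class score, then sums and rectifies.

```python
import torch
import torch.nn as nn

# Minimal Grad-CAM sketch over a toy CNN (illustrative; TinyCNN is hypothetical).
class TinyCNN(nn.Module):
    def __init__(self, num_classes=4):
        super().__init__()
        self.features = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
        self.head = nn.Linear(8, num_classes)

    def forward(self, x):
        f = self.features(x)                        # (B, 8, H, W) activations
        return self.head(f.mean(dim=(2, 3))), f     # logits + feature map

model = TinyCNN()
x = torch.randn(1, 3, 32, 32)
logits, fmap = model(x)
fmap.retain_grad()                                  # keep grads w.r.t. activations
logits[0, logits.argmax()].backward()               # backprop the top class score
weights = fmap.grad.mean(dim=(2, 3), keepdim=True)  # per-channel importance
cam = torch.relu((weights * fmap).sum(dim=1))       # (1, H, W) heatmap
print(cam.shape)  # torch.Size([1, 32, 32])
```

The resulting `cam` is then upsampled to the input resolution and overlaid on the leaf image to visualize which regions drove the prediction.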

Preparation

Implementations

PlantAIM (2H) >> PyTorch implementation code

PlantAIM (1H) >> PyTorch implementation code

Notes

  • The CSV file (image metadata) is available here

See also

  1. Pairwise Feature Learning for Unseen Plant Disease Recognition: The first implementation of the FF-ViT model with a moving weighted sum. The current work improves the FF-ViT model and evaluates its performance on a larger-scale dataset.
  2. Unveiling Robust Feature Spaces: Image vs. Embedding-Oriented Approaches for Plant Disease Identification: An analysis of image versus embedding feature spaces for plant disease identification.
  3. Beyond-supervision-Harnessing-self-supervised-learning-in-unseen-plant-disease-recognition: A Cross Learning Vision Transformer (CL-ViT) model that incorporates self-supervised learning into a supervised model.

Dependencies

pandas == 1.4.1
numpy == 1.22.2
torch == 1.10.2
timm == 0.5.4
tqdm == 4.62.3
torchvision == 0.11.3
albumentations == 1.1.0
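The pinned versions above can be collected into a requirements file for a reproducible install. This is a convenience sketch, not a file shipped with the repository:

```text
# requirements.txt (assembled from the Dependencies list above)
pandas==1.4.1
numpy==1.22.2
torch==1.10.2
timm==0.5.4
tqdm==4.62.3
torchvision==0.11.3
albumentations==1.1.0
```

Install with `pip install -r requirements.txt`.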

License

Creative Commons Attribution-Noncommercial-NoDerivative Works 4.0 International License (“the CC BY-NC-ND License”)

Citation

@article{chai2025plantaim,
  title={PlantAIM: A New Baseline Model Integrating Global Attention and Local Features for Enhanced Plant Disease Identification},
  author={Chai, Abel Yu Hao and Lee, Sue Han and Tay, Fei Siang and Go{\"e}au, Herv{\'e} and Bonnet, Pierre and Joly, Alexis},
  journal={Smart Agricultural Technology},
  pages={100813},
  year={2025},
  publisher={Elsevier}
}