
Bean Leaf Lesions Classification

Problem and Project Description

Bean crops are susceptible to various diseases, and early detection of these diseases is crucial for ensuring a healthy harvest. A common symptom is the development of leaf lesions, typically caused by diseases such as angular leaf spot or bean rust, which must be distinguished from healthy leaves. Traditional methods of disease diagnosis can be time-consuming and may not provide timely information for effective intervention.

In this project, we aim to develop a deep learning model for the automated classification of bean leaf lesions into three classes: angular leaf spot, bean rust, and healthy. By leveraging the power of convolutional neural networks (CNNs), we intend to create a robust solution that can accurately identify and classify lesions based on images of bean leaves.


Dataset

The dataset for this project is taken from Kaggle (Dataset link). It can also be downloaded from the following Google Drive link: https://drive.google.com/file/d/1zyce3Y661pJ82PfCe0Kp93N2-Nkz1Var/view?usp=sharing. Alternatively, you can run the command below:

!gdown --id '1zyce3Y661pJ82PfCe0Kp93N2-Nkz1Var' 

Dependencies

To run this project, you will need the following dependencies:

  • Python: 3.9.13
  • tensorflow: 2.9.1
  • tflite_runtime: 2.14.0
  • keras_image_helper: 0.0.1

Project dependencies can be installed by running:

pip install -r requirements.txt

Alternatively, you can use a Conda virtual environment with the prepared environment.yml file:

conda env create -f opt/environment.yml

This experiment was run on Saturn Cloud (saturn.io) using a single TPU.

Model Creation

In this project, we employ the Convolutional Neural Network (CNN) architecture, specifically Xception. Xception is a deep convolutional neural network architecture that utilizes Depthwise Separable Convolutions (reference). Additionally, we apply transfer learning techniques using pre-trained weights from 'imagenet'.
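As an illustration, below is a minimal Keras sketch of this transfer-learning setup. The frozen base, the global average pooling layer, and the 150x150 input size are assumptions; the hyperparameter defaults mirror the best parameters reported below. See the notebook for the exact architecture.

import tensorflow as tf
from tensorflow import keras

def make_model(input_size=150, learning_rate=0.01, inner_size=100, droprate=0.8):
    # Pre-trained Xception backbone with its ImageNet classification head removed
    base_model = keras.applications.Xception(
        weights="imagenet",
        include_top=False,
        input_shape=(input_size, input_size, 3),
    )
    base_model.trainable = False  # freeze the convolutional base for transfer learning

    inputs = keras.Input(shape=(input_size, input_size, 3))
    x = base_model(inputs, training=False)
    x = keras.layers.GlobalAveragePooling2D()(x)
    x = keras.layers.Dense(inner_size, activation="relu")(x)  # inner dense layer
    x = keras.layers.Dropout(droprate)(x)                     # dropout regularization
    outputs = keras.layers.Dense(3)(x)  # logits for angular leaf spot, bean rust, healthy

    model = keras.Model(inputs, outputs)
    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
        loss=keras.losses.CategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )
    return model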

Evaluation Metric

Since the dataset used in this project is balanced, we use accuracy as our evaluation metric. The formula is shown below:

$$\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}$$

Where:

  • $TP$ (True Positives): The number of samples correctly predicted as positive (correctly classified instances of the positive class).
  • $TN$ (True Negatives): The number of samples correctly predicted as negative (correctly classified instances of the negative class).
  • $FP$ (False Positives): The number of samples incorrectly predicted as positive (instances of the negative class misclassified as the positive class).
  • $FN$ (False Negatives): The number of samples incorrectly predicted as negative (instances of the positive class misclassified as the negative class).
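For a multi-class problem like this one, accuracy reduces to the fraction of correctly classified samples across all classes. A minimal check on hypothetical labels:

import numpy as np

y_true = np.array([0, 1, 2, 1, 0, 2])  # hypothetical ground-truth class ids
y_pred = np.array([0, 1, 2, 0, 0, 2])  # hypothetical model predictions

accuracy = np.mean(y_true == y_pred)   # fraction of correct predictions
print(f"Accuracy: {accuracy:.3f}")     # 0.833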

Best Model

From the exploration, we determined that the Xception model with a dropout regularization modification is the best model, achieving an accuracy of 94.7% on the validation data. The best parameters found are listed below:

  • learning rate: 0.01
  • inner size: 100
  • droprate: 0.8

For the detailed exploration, please refer to the notebook.

Model Deployment On Local

The bean leaf model is served through an AWS Lambda function. For that purpose, the first step is to prepare the script and test it locally. The steps are as follows:

  1. Minimize the trained model (xception_bean_leaf_16_0.947.h5) by converting it to a more lightweight TFLite version. The full implementation of the conversion is in this notebook. The process produces a .tflite model, which can be found here.
  2. Next, prepare a script that takes an image input, feeds it to the model, and returns the prediction scores as the response. The script is written in lambda_function.py. It essentially contains a predict function that preprocesses the input image, loads the TFLite model into an interpreter, feeds the input to the model, and returns the model's prediction scores. A sketch of steps 1 and 2 follows this list.
  3. Test the lambda_function by simply importing the script's function. If it is implemented correctly, it will return the model prediction. As input, it accepts an image URL; you can host your image on any file hosting provider first, or simply use this sample image: https://drive.google.com/uc?export=view&id=1MGvOaIy94muwFCofOd88pNRszUUiwdvf. To generate a direct image link from Google Drive, check this url.
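Below is a minimal sketch of steps 1 and 2, assuming a 150x150 input size, Xception-style preprocessing, an alphabetical class order, and an event payload of the form {"url": "<image url>"}; refer to the linked notebook and lambda_function.py for the exact implementation.

Conversion to TFLite:

import tensorflow as tf

# Load the trained Keras model and convert it to a lightweight TFLite model
model = tf.keras.models.load_model("xception_bean_leaf_16_0.947.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("bean-leaf-model.tflite", "wb") as f:
    f.write(tflite_model)

A lambda_function.py-style predict:

import tflite_runtime.interpreter as tflite
from keras_image_helper import create_preprocessor

# Downloads an image from a URL and applies Xception preprocessing
preprocessor = create_preprocessor("xception", target_size=(150, 150))

interpreter = tflite.Interpreter(model_path="bean-leaf-model.tflite")
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]["index"]
output_index = interpreter.get_output_details()[0]["index"]

classes = ["angular_leaf_spot", "bean_rust", "healthy"]

def predict(url):
    X = preprocessor.from_url(url)                # fetch and preprocess the image
    interpreter.set_tensor(input_index, X)
    interpreter.invoke()
    preds = interpreter.get_tensor(output_index)  # raw prediction scores
    return dict(zip(classes, preds[0].tolist()))

def lambda_handler(event, context):
    # AWS Lambda entry point
    return predict(event["url"])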

Containerize Model Using Docker

For the sake of portability and ease of deployment, we wrap the model in an isolated container environment (Docker, in this case). The steps:

  1. Make sure Docker Desktop is already installed on your computer, or install it here.
  2. Build a Docker image from the prepared Dockerfile script:
docker build -t bean-leaf-model .
  3. Make sure the Docker image was created successfully:
docker images
  4. Run the Docker image on port 8080:
docker run -it --rm -p 8080:8080 bean-leaf-model:latest
  5. Test the containerized model using test.py (a sketch of such a test is shown after this list).
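A minimal test.py-style request against the local container might look like the following; the endpoint path is the standard one exposed by the AWS Lambda runtime interface emulator, and the payload key is assumed to match the lambda_function sketch above.

import requests

# Invocation endpoint of the Lambda runtime interface emulator inside the container
url = "http://localhost:8080/2015-03-31/functions/function/invocations"

data = {"url": "https://drive.google.com/uc?export=view&id=1MGvOaIy94muwFCofOd88pNRszUUiwdvf"}

result = requests.post(url, json=data).json()
print(result)  # e.g. scores for angular_leaf_spot, bean_rust, healthy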

Deploy Model on Cloud (AWS)

After successfully encapsulating our model within a Docker container, we are now prepared to deploy it to a cloud environment. In this scenario, we leverage cloud computing services offered by AWS, specifically AWS Lambda and AWS API Gateway. AWS Lambda serves as our serverless environment for hosting the model, while AWS API Gateway functions as our REST API service. When a user makes a request to the REST API, API Gateway seamlessly forwards the user's request to our Lambda service. Step by step:

  1. Create a repository in AWS ECR:
aws ecr create-repository --repository-name bean-leaf-tflite-images
  2. Log in to the ECR repo (our container repository service in AWS):
aws ecr get-login-password --region <your-aws-region> | docker login --username AWS --password-stdin <your-aws-id>.dkr.ecr.<your-aws-region>.amazonaws.com/bean-leaf-tflite-images
  3. Tag the locally created Docker image and push it to the ECR repo:
docker tag <image_name_in_your_local_desktop>:latest <repo_URI>/<repo_name_in_ECR>:latest
docker push <repo_URI>/<repo_name_in_ECR>:latest
  4. Create a Lambda function, choosing "Container image" as the function option and selecting the created ECR repo.
  5. Create a REST API with API Gateway: click the "Create API" button, add a POST resource with a "/predict" endpoint route, and, last but not least, click the "Deploy API" button.
  6. Test with the test.py script, changing the url to your newly created REST API endpoint.

This project is already deployed to AWS and can be accessed at:

 [POST] https://9f12y6gsoe.execute-api.ap-southeast-1.amazonaws.com/production/predict

The link above can also be tested using the Postman tool.
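For example, a quick request against the deployed endpoint (assuming the same {"url": ...} payload as in the local test) could look like:

import requests

url = "https://9f12y6gsoe.execute-api.ap-southeast-1.amazonaws.com/production/predict"
data = {"url": "https://drive.google.com/uc?export=view&id=1MGvOaIy94muwFCofOd88pNRszUUiwdvf"}

print(requests.post(url, json=data).json())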
