
adding tutorials for resnet50 and lenet5 #11

Merged
merged 20 commits into from
Mar 1, 2021
Changes from 7 commits
12 changes: 11 additions & 1 deletion tinyms/serving/client/client.py
@@ -16,6 +16,7 @@
 import json
 import sys
 import requests
+import numpy as np
 from PIL import Image
 from tinyms.vision import mnist_transform, cifar10_transform, imagefolder_transform

@@ -69,4 +70,13 @@ def predict(img_path, servable_name, dataset_name="mnist"):
     elif res_body['status'] != 0:
         print(res_body['err_msg'])
     else:
-        print(res_body['instance'])
+        instance = res_body['instance']
+        if dataset_name == "mnist":
+            data = mnist_transform.postprocess(np.array(json.loads(instance['data'])), strategy='TOP1_CLASS')
+            print("Prediction is: "+str(data))
+        elif dataset_name == "imagenet2012":
+            data = imagefolder_transform.postprocess(np.array(json.loads(instance['data'])), strategy='TOP1_CLASS')
+            print("Prediction is: "+str(data))
+        else:
+            data = cifar10_transform.postprocess(np.array(json.loads(instance['data'])), strategy='TOP1_CLASS')
+            print("Prediction is: "+str(data))
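The three new branches differ only in which transform performs the postprocessing. A minimal sketch of the same dispatch as a table lookup, using stub transforms in place of the real `mnist_transform`, `imagefolder_transform` and `cifar10_transform` from `tinyms.vision` (the stubs and `postprocess_instance` are illustrative names, not TinyMS API):

```python
import json
import numpy as np

class _StubTransform:
    """Stand-in for the tinyms.vision transforms used in client.py."""

    def postprocess(self, data, strategy='TOP1_CLASS'):
        # The real transforms map logits to a class label; the stub
        # just returns the index of the largest logit.
        return int(np.argmax(data))

# Dataset name -> transform, mirroring the if/elif/else in client.py.
_TRANSFORMS = {
    'mnist': _StubTransform(),
    'imagenet2012': _StubTransform(),
    'cifar10': _StubTransform(),
}

def postprocess_instance(instance, dataset_name='mnist'):
    """Decode the server reply and run the dataset-specific postprocess."""
    # Unknown dataset names fall back to cifar10, matching the
    # else-branch of the original code.
    transform = _TRANSFORMS.get(dataset_name, _TRANSFORMS['cifar10'])
    data = np.array(json.loads(instance['data']))
    return transform.postprocess(data, strategy='TOP1_CLASS')

# A fake server reply whose largest logit sits at index 7:
reply = {'data': json.dumps([0.0] * 7 + [9.5, 0.1, 0.2])}
print('Prediction is: ' + str(postprocess_instance(reply, 'mnist')))
```

A lookup table keeps the fallback behavior in one place and avoids repeating the identical `postprocess`/`print` pair per branch.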
100 changes: 100 additions & 0 deletions tinyms/tutorial/en/LeNet5/LeNet5_Client_tutorial.ipynb
@@ -0,0 +1,100 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "exotic-condition",
"metadata": {},
"source": [
"# TinyMS LeNet5 Client Tutorial\n",
"\n",
"### This tutorial demonstrates how to send an inference request to a LeNet5 server using the TinyMS API and get the result. Please make sure you have launched a LeNet5 server first; refer to `LeNet5_Server_tutorial.ipynb` if needed.\n",
"\n",
"\n",
"## Steps\n",
"\n",
"### 1. Upload the pic\n",
"\n",
"In this tutorial, a LeNet5 model trained on the MNIST dataset is hosted in the backend, and a picture of a single handwritten digit is required as input. If you are using a terminal, either `scp` or `wget` will do; if you are running in Jupyter, click the `Upload` button at the top right and select the picture. The picture used in this tutorial can be found [here](https://3qeqpr26caki16dnhd19sv6by6v-wpengine.netdna-ssl.com/wp-content/uploads/2019/02/sample_image.png).\n",
"\n",
"Save the picture to the root folder and rename it to `7.png` (or any other name you like).\n",
"\n",
"\n",
"### 2. List servables\n",
"\n",
"Now we can use the `list_servables` function to check which models are currently servable."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "annoying-restoration",
"metadata": {},
"outputs": [],
"source": [
"from tinyms.serving import list_servables, predict\n",
"\n",
"list_servables()"
]
},
{
"cell_type": "markdown",
"id": "coastal-reference",
"metadata": {},
"source": [
"If the output `description` shows a lenet5 model, we can continue to the next step and send our request."
]
},
{
"cell_type": "markdown",
"id": "lasting-operations",
"metadata": {},
"source": [
"### 3. Send the request and get the result\n",
"\n",
"Run `predict` function to send the request:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "divine-prototype",
"metadata": {},
"outputs": [],
"source": [
"# predict(img_path, servable_name, dataset_name='mnist')\n",
"predict('/root/7.png', 'lenet5')"
]
},
{
"cell_type": "markdown",
"id": "unauthorized-ranch",
"metadata": {},
"source": [
"If you see output similar to this: \n",
"`Prediction is: 7` \n",
"then you have successfully predicted a number."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.5"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
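Under the hood, `predict` is a small HTTP client talking to the Flask server described in the server tutorial. The exact wire format is internal to TinyMS, so the sketch below only illustrates the general pattern of assembling such a request; the field names (`servable_name`, `dataset_name`, `instance`) and the endpoint in the comment are assumptions for illustration, not the actual TinyMS protocol:

```python
import base64

def build_request(img_path, servable_name, dataset_name='mnist'):
    """Assemble a JSON-serializable inference request (illustrative only).

    The payload field names here are assumed for the sketch and are
    not the real TinyMS wire format.
    """
    # Images are binary, so a text-safe encoding such as base64 is a
    # common choice for JSON payloads.
    with open(img_path, 'rb') as f:
        img_b64 = base64.b64encode(f.read()).decode('utf-8')
    return {
        'servable_name': servable_name,
        'dataset_name': dataset_name,
        'instance': {'data': img_b64},
    }

# The client would then POST the payload to the serving endpoint, e.g.:
# requests.post('http://127.0.0.1:5000/predict',
#               json=build_request('/root/7.png', 'lenet5'))
```

The server decodes the image, runs the model, and returns a JSON body whose `instance` field the client postprocesses into a label such as `Prediction is: 7`.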
240 changes: 240 additions & 0 deletions tinyms/tutorial/en/LeNet5/LeNet5_Server_tutorial.ipynb
@@ -0,0 +1,240 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "affecting-convenience",
"metadata": {},
"source": [
"# TinyMS LeNet5 Server Tutorial\n",
"\n",
"### This tutorial demonstrates how to construct a LeNet5 model, download the dataset, train the model, and start the model server using the TinyMS API. \n",
"\n",
"## Prerequisite\n",
" - Ubuntu: `18.04`\n",
" - Python: `3.7.x`\n",
" - Flask: `1.1.2`\n",
" - MindSpore: `CPU-1.1.1`\n",
" - TinyMS: `0.1.0`\n",
" - numpy: `1.17.5`\n",
" - opencv-python: `4.5.1.48`\n",
" - Pillow: `8.1.0`\n",
" - pip: `21.0.1`\n",
" - requests: `2.18.4`\n",
" \n",
"## Introduction\n",
"\n",
"TinyMS is a high-level API designed for deep learning beginners. It minimizes the number of actions users need to construct, train, evaluate and serve a model. TinyMS also provides tutorials and documentation for developers. \n",
"\n",
"This tutorial consists of five parts: constructing the model, downloading the dataset, training, defining the servable JSON, and starting the server. Sending a prediction request is demonstrated in the `LeNet5_Client_tutorial.ipynb` file."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "phantom-mills",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import json\n",
"import tinyms as ts\n",
"import tinyms.optimizers as opt\n",
"from tinyms.data import MnistDataset, download_dataset\n",
"from tinyms.vision import mnist_transform\n",
"from tinyms.model import Model, lenet5\n",
"from tinyms.serving import start_server\n",
"from tinyms.metrics import Accuracy\n",
"from tinyms.losses import SoftmaxCrossEntropyWithLogits"
]
},
{
"cell_type": "markdown",
"id": "aggressive-medline",
"metadata": {},
"source": [
"## 1. Construct the model\n",
"\n",
"TinyMS encapsulates the initialization and construction of the LeNet5 model, so it can be built in a single line of code:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "matched-explanation",
"metadata": {},
"outputs": [],
"source": [
"net = lenet5(class_num=10)"
]
},
{
"cell_type": "markdown",
"id": "guided-activation",
"metadata": {},
"source": [
"## 2. Download dataset\n",
"\n",
"The MNIST dataset will be downloaded if the `mnist` folder does not exist at the root. If the `mnist` folder already exists, this step is skipped."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "laden-slovakia",
"metadata": {},
"outputs": [],
"source": [
"# download the dataset\n",
"mnist_path = '/root/mnist'\n",
"if not os.path.exists(mnist_path):\n",
" ts.data.download_dataset('mnist', '/root')\n",
" print('************Download complete*************')\n",
"else:\n",
" print('************Dataset already exists.**************')"
]
},
{
"cell_type": "markdown",
"id": "regulated-bradford",
"metadata": {},
"source": [
"## 3. Train the model & evaluation\n",
"\n",
"The datasets for training and evaluation are defined here, and the training parameters are also set in this block. A trained ckpt file will be saved to the `/etc/tinyms/serving/lenet5` folder for later use; evaluation is performed afterwards and the `Accuracy` metric can be checked."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "bearing-showcase",
"metadata": {},
"outputs": [],
"source": [
"# define the training and evaluation dataset\n",
"batch_size = 32\n",
"train_dataset = MnistDataset(os.path.join(mnist_path, \"train\"), shuffle=True)\n",
"train_dataset = mnist_transform.apply_ds(train_dataset)\n",
"eval_dataset = MnistDataset(os.path.join(mnist_path, \"test\"), shuffle=True)\n",
"eval_dataset = mnist_transform.apply_ds(eval_dataset)\n",
"\n",
"# parameters for training\n",
"lr = 0.01\n",
"momentum = 0.9\n",
"epoch_size = 3\n",
"net_loss = SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')\n",
"net_opt = opt.Momentum(net.trainable_params(), lr, momentum)\n",
"net_metrics={\"Accuracy\": Accuracy()}\n",
"\n",
"model = Model(net)\n",
"model.compile(loss_fn=net_loss, optimizer=net_opt, metrics=net_metrics)\n",
"print('************************Start training*************************')\n",
"model.train(epoch_size, train_dataset)\n",
"model.save_checkpoint('/etc/tinyms/serving/lenet5/lenet5.ckpt')\n",
"print('************************Finished training*************************')\n",
"\n",
"model.load_checkpoint('/etc/tinyms/serving/lenet5/lenet5.ckpt')\n",
"print('************************Start evaluation*************************')\n",
"model.eval(eval_dataset)"
]
},
{
"cell_type": "markdown",
"id": "boolean-hazard",
"metadata": {},
"source": [
"## 4. Define servable.json\n",
"\n",
"Define the lenet5 servable JSON file, which specifies the model name, format, and number of classes for serving."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "colored-rolling",
"metadata": {},
"outputs": [],
"source": [
"servable_json = [{'name': 'lenet5', \n",
" 'description': 'This servable hosts a lenet5 model predicting numbers', \n",
" 'model': {\n",
" \"name\": \"lenet5\", \n",
" \"format\": \"ckpt\", \n",
" \"class_num\": 10}}]\n",
"os.chdir(\"/etc/tinyms/serving\")\n",
"json_data = json.dumps(servable_json, indent=4)\n",
"\n",
"with open('servable.json', 'w') as json_file:\n",
" json_file.write(json_data)"
]
},
{
"cell_type": "markdown",
"id": "sustained-matthew",
"metadata": {},
"source": [
"## 5. Start server\n",
"\n",
"### 5.1 Introduction\n",
"TinyMS Serving uses a client/server (C/S) architecture. TinyMS uses [Flask](https://flask.palletsprojects.com/en/1.1.x/), a micro web framework written in Python, as the C/S communication tool. To serve a model, the user must start the server first. Once started, the server listens for POST requests on 127.0.0.1 port 5000 sent by the client and handles them with the MindSpore backend, which constructs the model, runs the prediction, and sends the result back to the client.\n",
"\n",
"### 5.2 Start server\n",
"\n",
"Run the following code block to start the server:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "convinced-theorem",
"metadata": {},
"outputs": [],
"source": [
"start_server()"
]
},
{
"cell_type": "markdown",
"id": "crucial-chick",
"metadata": {},
"source": [
"If you can see something similar to this:\n",
"```\n",
"* Serving Flask app \"tinyms.serving.server.server\" (lazy loading)\n",
" * Environment: production\n",
" WARNING: This is a development server. Do not use it in a production deployment.\n",
" Use a production WSGI server instead.\n",
" * Debug mode: off\n",
" * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)\n",
"```\n",
"then you have successfully launched the server. Next, open `LeNet5_Client_tutorial.ipynb` to continue.\n",
"\n",
"## Shutdown server\n",
"\n",
"To restart the server, click `Kernel` at the top, then click `Restart & Clear Output`.\n",
"\n",
"To shut down the server: if you are using a terminal, press CTRL + C; if you are running in Jupyter, click `Kernel` at the top, then click `Shutdown`."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.5"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
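Step 4 of the server tutorial serializes the servable definition to `servable.json` before the server reads it. A quick round-trip check like the one below can confirm the file is valid JSON with the expected fields; it writes to the working directory instead of `/etc/tinyms/serving` (an assumption for this sketch, so it runs without root privileges):

```python
import json

# Same servable definition as in the tutorial's step 4.
servable_json = [{'name': 'lenet5',
                  'description': 'This servable hosts a lenet5 model predicting numbers',
                  'model': {
                      'name': 'lenet5',
                      'format': 'ckpt',
                      'class_num': 10}}]

# Write to the current directory rather than /etc/tinyms/serving,
# so this sanity check needs no special permissions.
with open('servable.json', 'w') as f:
    json.dump(servable_json, f, indent=4)

# Read it back and verify the fields the server relies on.
with open('servable.json') as f:
    loaded = json.load(f)

assert loaded[0]['model']['class_num'] == 10
print('servable.json OK:', loaded[0]['name'])
```

Because `json.dump`/`json.load` round-trip the structure exactly, a passing check here rules out formatting mistakes before `start_server()` tries to parse the file.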