
Serving ARIMA model with BentoML

This project shows how to serve a continuously updated ARIMA model for time-series data with BentoML in order to forecast future values.
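At its core, train.py fits a statsmodels ARIMA model and saves it to the local BentoML model store. The following is a minimal sketch of what that might look like; the synthetic series, the ARIMA order, and the use of bentoml.picklable_model are illustrative assumptions, while the model name arima_forecast_model matches the Bento built later:

import bentoml
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Illustrative training series; the real project loads its own dataset.
data = np.cumsum(np.random.normal(loc=1.0, scale=5.0, size=100))

# Fit an ARIMA(1, 1, 1) model; the actual order used by train.py may differ.
fitted = ARIMA(data, order=(1, 1, 1)).fit()

# Save the fitted results object to the BentoML model store under the
# same name that the service and bentofile reference.
bentoml.picklable_model.save_model("arima_forecast_model", fitted)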

Requirements

Install requirements with:

pip install -r ./requirements.txt

Instructions

  1. Train and save the model:
python ./train.py
  2. Run the service (see the sketch of the service definition after this list):
bentoml serve
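The service loads the saved model and exposes a /predict endpoint. Below is a minimal sketch of a service.py, assuming the model was stored with bentoml.picklable_model; the actual service may instead use a runner, or refit the model on incoming data to realize the continuous-learning behaviour:

import bentoml
from bentoml.io import JSON

# Load the latest saved ARIMA results object from the model store.
model = bentoml.picklable_model.load_model("arima_forecast_model:latest")

svc = bentoml.Service("arima_forecast")

@svc.api(input=JSON(), output=JSON())
def predict(payload: dict) -> list:
    # A request body of {"data": [5]} asks for 5 future values.
    steps = int(payload["data"][0])
    return model.forecast(steps=steps).tolist()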

Test the endpoint

Open http://0.0.0.0:3000 in a browser, or call the /predict endpoint with curl, to forecast 5 future values:

curl -X 'POST' 'http://0.0.0.0:3000/predict' -H 'accept: application/json' -H 'Content-Type: application/json' -d '{"data": [5]}'

Sample result:

[
  21.32297249948254,
  39.103166807895505,
  51.62030696797619,
  57.742863144656305,
  57.316390331155915
]
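The same request can be made from Python. A small sketch using the requests library (an assumed extra dependency, not necessarily listed in requirements.txt):

import requests

# Ask the running service for a forecast of the next 5 values.
response = requests.post(
    "http://0.0.0.0:3000/predict",
    json={"data": [5]},
)
print(response.json())  # a list of 5 forecasted floats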

Build Bento

Build the Bento using bentofile.yaml, which contains all the required configuration:

bentoml build -f ./bentofile.yaml
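A minimal bentofile.yaml for a project like this might look as follows; the service entry point, included files, and package list are assumptions about this repository's layout:

service: "service:svc"   # module:variable of the bentoml.Service object
include:
  - "*.py"               # source files packaged into the Bento
python:
  packages:              # runtime dependencies installed into the Bento
    - bentoml
    - statsmodels
    - numpy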

Once the Bento is built, containerize it as a Docker image for deployment:

bentoml containerize arima_forecast_model:latest
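The resulting image can then be run locally with the service port mapped; note that the exact image tag is printed by the containerize command and may be a version hash rather than latest:

docker run --rm -p 3000:3000 arima_forecast_model:latest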
