feature: add support for online ML models from River library in BentoML

jsamantaucd/BentoRiverModel


Serving a model from River with BentoML

This project shows how to train a model with the River online machine learning library, log it as a custom Python function model with MLflow's mlflow.pyfunc.log_model, and import it into the BentoML model store for serving.

Requirements

Install requirements with:

pip install -r ./requirements.txt

Instructions

  1. Train and save the model:
python ./train.py
  2. Run the service:
bentoml serve

Test the endpoint

Open http://0.0.0.0:3000 in a browser to try a prediction, or call the endpoint directly:

curl -X 'POST' 'http://0.0.0.0:3000/predict' \
     -H 'accept: application/json' \
     -H 'Content-Type: application/json' \
     -d '{
        "ordinal_date": 836489,
        "gallup": 47.843213,
        "ipsos": 48.07067899999999,
        "morning_consult": 52.318749,
        "rasmussen": 50.104692,
        "you_gov": 58.636914000000004
        }'

Sample result:

42.74910074533503

Build Bento

Build the Bento using bentofile.yaml, which contains all the required configuration:

bentoml build -f ./bentofile.yaml
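The actual bentofile.yaml is not shown here; a minimal version for this kind of layout might look like the following (the service entry point and package list are assumptions):

```yaml
service: "service:svc"   # module:variable of the bentoml.Service object
include:
  - "*.py"
python:
  packages:
    - river
    - mlflow
    - pandas
```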

Once the Bento is built, containerize it as a Docker image for deployment:

bentoml containerize river_arf_model:latest
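The resulting image can then be run locally; :latest resolves to the most recently built tag, and port 3000 is BentoML's default HTTP port:

```shell
# Run the containerized Bento and expose the service port.
docker run --rm -p 3000:3000 river_arf_model:latest
```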
