Releases: bentoml/BentoML
BentoML-0.4.0 Beta
- Redesigned deployment component is available now, take a look at the deploy command:
bentoml deployment --help
- Multiple image support in ImageHandler
- Yatai Service Beta Release - a new component in BentoML providing a model registry and deployment manager for your BentoService. It's a stateful service that can run on your local machine for a personal project, or be hosted on a server and shared by a machine learning team.
BentoML-0.3.4
- Added the pip_dependencies option to the @bentoml.env decorator, making it the recommended approach for adding PyPI dependencies (see the sketch after this list)
- Fixed an issue related to the OpenAPI doc spec when using ImageHandler
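A minimal sketch of the pip_dependencies option, assuming the 0.3.4-era @bentoml.env API and import paths; the service name, handler, and dependency list here are illustrative:
from bentoml import BentoService, api, env
from bentoml.handlers import DataframeHandler

# Illustrative: declare PyPI dependencies directly on the service class
@env(pip_dependencies=['scikit-learn', 'pandas'])
class IrisClassifier(BentoService):

    @api(DataframeHandler)
    def predict(self, df):
        # model inference would go here
        ...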
BentoML Developer Notes
- DEV: added versioneer.py for version management, now using git tags to manage releases
- DEV: Yatai service protobufs and generated interfaces are now checked into the repo
BentoML-0.3.1
This is a minor release with mostly bug fixes:
- Added the bentoml config CLI command for configuring local BentoML preferences and configs
- Fixed an issue when serving a Keras model with the API server in Docker
- Fixed an issue with missing dependencies in the Docker environment when using ImageHandler
BentoML-0.3.0
- Fast.ai support, find example notebooks here: https://github.com/bentoml/gallery/tree/master/fast-ai
- PyTorch support - fixed a number of issues related to PyTorch model serialization and updated the example notebook here: https://github.com/bentoml/BentoML/blob/master/examples/pytorch-fashion-mnist/pytorch-fashion-mnist.ipynb
- Keras support - fixed a number of issues related to serving Keras models with the API server
- Clipper deployment support - easily deploy a BentoML service to a Clipper cluster, read more about it here: https://github.com/bentoml/BentoML/blob/master/examples/deploy-with-clipper/deploy-iris-classifier-to-clipper.ipynb
- ImageHandler improvements - the API server's web UI now supports posting images to the API server for testing API endpoints (see the sketch after this list)
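A minimal sketch of an ImageHandler-based API, assuming the legacy bentoml.handlers interface of this release line; the service name and return value are illustrative:
from bentoml import BentoService, api
from bentoml.handlers import ImageHandler

class ImageClassifier(BentoService):

    @api(ImageHandler)
    def predict(self, img):
        # img is the decoded image passed in by ImageHandler;
        # a real service would run model inference here
        return {'shape': getattr(img, 'shape', None)}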
BentoML-0.2.2 Beta
- Fast.ai support is in beta now, check out the example notebook here: https://colab.research.google.com/github/bentoml/gallery/blob/master/fast-ai/pet-classification/notebook.ipynb
- Improved OpenAPI docs endpoint
- DataframeHandler now allows specifying input types - users can also generate an API client library that respects the expected input format for each user-defined BentoML API service, e.g.:
# imports assume the legacy BentoML 0.2.x API
from bentoml import BentoService, api
from bentoml.handlers import DataframeHandler

class MyClassifier(BentoService):

    @api(DataframeHandler, input_types=['int8', 'int8', 'float', 'str', 'bool'])
    def predict(self, df):
        ...

    # or specifying both column name & type:
    @api(DataframeHandler, input_types={'id': 'string', 'age': 'int'})
    def predict(self, df):
        ...
- API server index page now provides a web UI for testing API endpoints and shows instructions for generating the client API library
BentoML-0.2.1 Beta
- Improved inline Python docs, and a new documentation site launched: https://bentoml.readthedocs.io
- Support for running examples on Google Colab
- OpenAPI docs endpoint beta
- Configurable prediction and feedback logging in API server
- Serverless deployment improved
- Bug fixes
BentoML-0.2.0 Beta
New in BentoML version 0.2.0:
- Support for H2O models
- Support for deploying BentoArchive to Amazon SageMaker endpoint
- Support for querying deployment status and deleting deployments created with BentoML
- Fixed Kubernetes ingress configuration for the gunicorn server #136
BentoML-0.1.2 Beta
- Breaking Change: updated the format of the bentoml.yml config file in the generated archive to include more information. Newer versions of BentoML won't be able to load archives generated before 0.1.2
- Added support for deploying to serverless platforms, including AWS Lambda and Google Cloud Functions
- Added REST API index page documentation on available endpoints
BentoML-0.1.1 Beta
- Added XGBoost support
- Added DataframeHandler options including 'typ', 'input_columns', and 'orient' (see the sketch after this list)
- The BentoML CLI command now supports JSON strings as input, e.g. --input='{"col": {"0": "bc"}}'
- The BentoML CLI command now supports input files from S3, e.g. --input=s3://my-bucket/test.csv
- Added gunicorn support, serve-gunicorn is now the default in the generated Docker image
- Added the @ver decorator for specifying versions with semantic versioning
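A minimal sketch combining the @ver decorator with the new DataframeHandler options, assuming the early decorator API and import paths; the major/minor keyword arguments and the option values shown are illustrative assumptions:
from bentoml import BentoService, api, ver
from bentoml.handlers import DataframeHandler

# major/minor keyword arguments are an assumption for illustration
@ver(major=1, minor=0)
class TitanicSurvivalPredictor(BentoService):

    # 'orient' and 'input_columns' are among the new DataframeHandler options
    @api(DataframeHandler, orient='records', input_columns=['age', 'fare'])
    def predict(self, df):
        ...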
BentoML-0.0.9 Beta
0.0.9-beta bump version