torchserve

TorchServe is a tool for serving neural network models for inference.


Keywords
TorchServe, PyTorch, Serving, Deep, Learning, Inference, AI, cpu, deep-learning, docker, gpu, kubernetes, machine-learning, metrics, mlops, optimization
License
Apache-2.0
Install
pip install torchserve==0.10.0

Documentation

TorchServe


TorchServe is a flexible and easy-to-use tool for serving and scaling PyTorch models in production.

Requires Python >= 3.8.

curl http://127.0.0.1:8080/predictions/bert -T input.txt
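The curl call above assumes a model named bert is already registered and the inference API is listening on its default port 8080. A minimal end-to-end sketch of how to get there (the model name, model_store path, model.pt file, and handler choice below are illustrative, not taken from this README):

# Package a trained model into a .mar archive (paths and handler are placeholders)
torch-model-archiver --model-name bert --version 1.0 \
    --serialized-file model.pt --handler text_classifier --export-path model_store

# Start TorchServe and load the archive from the model store
torchserve --start --ncs --model-store model_store --models bert=bert.mar

# Query the inference API (default port 8080)
curl http://127.0.0.1:8080/predictions/bert -T input.txt

# Stop the server when done
torchserve --stop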

šŸš€ Quick start with TorchServe

# Install dependencies
# cuda is optional
python ./ts_scripts/install_dependencies.py --cuda=cu121

# Latest release
pip install torchserve torch-model-archiver torch-workflow-archiver

# Nightly build
pip install torchserve-nightly torch-model-archiver-nightly torch-workflow-archiver-nightly
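After installing, a quick sanity check that the CLI tools are on your PATH (exact output format may vary between releases):

# Verify the installed versions
torchserve --version
pip show torch-model-archiver torch-workflow-archiver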

šŸš€ Quick start with TorchServe (conda)

# Install dependencies
# cuda is optional
python ./ts_scripts/install_dependencies.py --cuda=cu121

# Latest release
conda install -c pytorch torchserve torch-model-archiver torch-workflow-archiver

# Nightly build
conda install -c pytorch-nightly torchserve torch-model-archiver torch-workflow-archiver

Getting started guide
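Once the server is running, models can also be registered, scaled, and listed at runtime through the management API (default port 8081); a hedged sketch, assuming a bert.mar archive reachable from the configured model store:

# Register a model archive at runtime via the management API
curl -X POST "http://127.0.0.1:8081/models?url=bert.mar"

# Scale the number of workers for the model
curl -X PUT "http://127.0.0.1:8081/models/bert?min_worker=2"

# List registered models
curl http://127.0.0.1:8081/models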

šŸ³ Quick Start with Docker

# Latest release
docker pull pytorch/torchserve

# Nightly build
docker pull pytorch/torchserve-nightly

Refer to torchserve docker for details.
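A minimal sketch for trying the image, mapping the default ports (8080 for inference, 8081 for management, 8082 for metrics); adjust the tag and port bindings as needed:

# Run the latest release image with the default inference, management, and metrics ports exposed
docker run --rm -it -p 8080:8080 -p 8081:8081 -p 8082:8082 pytorch/torchserve:latest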

āš” Why TorchServe

šŸ¤” How does TorchServe work

šŸ† Highlighted Examples

For more examples

šŸ›”ļø TorchServe Security Policy

SECURITY.md

šŸ¤“ Learn More

https://pytorch.org/serve

šŸ«‚ Contributing

We welcome all contributions!

To learn more about how to contribute, see the contributor guide.

šŸ“° News

šŸ’– All Contributors

Made with contrib.rocks.

āš–ļø Disclaimer

This repository is jointly operated and maintained by Amazon, Meta, and a number of individual contributors listed in the CONTRIBUTORS file. For questions directed at Meta, please send an email to opensource@fb.com. For questions directed at Amazon, please send an email to torchserve@amazon.com. For all other questions, please open an issue in this repository.

TorchServe acknowledges the Multi Model Server (MMS) project, from which it was derived.