torchserve-client

Python Client for TorchServe APIs


Keywords
nbdev, jupyter, notebook, python
License
Apache-2.0
Install
pip install torchserve-client==0.0.2

Documentation

TorchServe Python Client

Install

pip install torchserve_client

Usage

Using torchserve_client is a breeze! It supports both the REST and gRPC APIs.

REST Client

To make calls to the REST endpoints, simply initialize a TorchServeClientREST object as shown below:

from torchserve_client import TorchServeClientREST

# Initialize the REST TorchServeClient object
ts_client = TorchServeClientREST()
ts_client
TorchServeClient(base_url=http://localhost, management_port=8081, inference_port=8080)

If you wish to customize the base URL, management port, or inference port of your TorchServe server, you can pass them as arguments during initialization:

from torchserve_client import TorchServeClientREST

# Customize the base URL, management port, and inference port
ts_client = TorchServeClientREST(base_url='http://your-torchserve-server.com', 
                                 management_port=8081, inference_port=8080)
ts_client
TorchServeClient(base_url=http://your-torchserve-server.com, management_port=8081, inference_port=8080)
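
Under the hood, these ports map to TorchServe's standard HTTP endpoints: the Management API on port 8081 and the Inference API on port 8080. As a rough sketch of what the client wraps, here are the equivalent raw calls with requests (the model name densenet161 and the input file kitten.jpg are placeholders):

import requests

# Management API (default port 8081): list the models currently registered
resp = requests.get("http://localhost:8081/models")
print(resp.json())

# Inference API (default port 8080): send an input file to a registered model
# "densenet161" is a placeholder model name; replace it with your own
with open("kitten.jpg", "rb") as f:
    resp = requests.post("http://localhost:8080/predictions/densenet161", data=f)
print(resp.json())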

gRPC Client

To create a gRPC client, simply initialize a TorchServeClientGRPC object as shown below:

from torchserve_client import TorchServeClientGRPC

# Initialize the gRPC TorchServeClient object
ts_client = TorchServeClientGRPC()
ts_client
TorchServeClientGRPC(base_url=localhost, management_port=7071, inference_port=7070)

To customize the base URL and default ports, pass them as arguments during initialization:

from torchserve_client import TorchServeClientGRPC

# Initialize the gRPC TorchServeClient object
ts_client = TorchServeClientGRPC(base_url='http://your-torchserve-server.com', 
                                 management_port=7071, inference_port=7070)
ts_client
TorchServeClientGRPC(base_url=your-torchserve-server.com, management_port=7071, inference_port=7070)
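
For reference, these ports correspond to TorchServe's gRPC Inference API (7070) and Management API (7071). A minimal sketch of a raw gRPC inference call, assuming you have generated Python stubs from TorchServe's inference.proto with grpcio-tools (the module names inference_pb2 and inference_pb2_grpc, the model name, and the input file are assumptions for illustration):

import grpc
import inference_pb2         # generated from TorchServe's inference.proto (assumed module name)
import inference_pb2_grpc    # generated gRPC stubs (assumed module name)

# Open an insecure channel to the gRPC Inference API (default port 7070)
channel = grpc.insecure_channel("localhost:7070")
stub = inference_pb2_grpc.InferenceAPIsServiceStub(channel)

# "densenet161" is a placeholder model name; input bytes can come from any file
with open("kitten.jpg", "rb") as f:
    request = inference_pb2.PredictionsRequest(
        model_name="densenet161",
        input={"data": f.read()},
    )
response = stub.Predictions(request)
print(response.prediction)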

With these intuitive APIs at your disposal, you can harness the full power of the Management and Inference APIs and take your application to the next level. Happy inferencing! 🚀🔥