llm-anyscale-endpoints

LLM plugin for models hosted by Anyscale Endpoints

Installation

First, install the LLM command-line utility.
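LLM itself can be installed with pip, pipx, or Homebrew; the pip route looks like this:

pip install llm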

Now install this plugin in the same environment as LLM.

llm install llm-anyscale-endpoints

Configuration

You will need an API key from Anyscale Endpoints, which you can obtain from your Anyscale Endpoints account.

You can set that as an environment variable called LLM_ANYSCALE_ENDPOINTS_KEY, or add it to llm's set of saved keys using:

llm keys set anyscale-endpoints
Enter key: <paste key here>
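
Alternatively, to use the environment variable instead of a saved key, export it in your shell before invoking llm (the value below is a placeholder for your real key):

export LLM_ANYSCALE_ENDPOINTS_KEY='<paste key here>'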

Usage

To list available models, run:

llm models list

You should see a list that looks something like this:

AnyscaleEndpoints: meta-llama/Llama-2-7b-chat-hf
AnyscaleEndpoints: meta-llama/Llama-2-13b-chat-hf
AnyscaleEndpoints: mistralai/Mixtral-8x7B-Instruct-v0.1
AnyscaleEndpoints: mistralai/Mistral-7B-Instruct-v0.1
AnyscaleEndpoints: meta-llama/Llama-2-70b-chat-hf
AnyscaleEndpoints: codellama/CodeLlama-70b-Instruct-hf
AnyscaleEndpoints: mistralai/Mixtral-8x22B-Instruct-v0.1
AnyscaleEndpoints: mlabonne/NeuralHermes-2.5-Mistral-7B
AnyscaleEndpoints: google/gemma-7b-it

To run a prompt against a model, pass its full model ID to the -m option, like this:

llm -m mistralai/Mixtral-8x22B-Instruct-v0.1 \
  'Five strident names for a pet walrus' \
  --system 'You love coming up with creative names for pets'

You can set a shorter alias for a model using the llm aliases command like so:

llm aliases set mix22b mistralai/Mixtral-8x22B-Instruct-v0.1
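
To confirm the alias was saved, you can list your current aliases:

llm aliases list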

Now you can prompt Mixtral-8x22B-Instruct-v0.1 using the alias mix22b:

cat llm_anyscale_endpoints.py | \
  llm -m mix22b -s 'explain this code'

You can refresh the list of models by running:

llm anyscale-endpoints refresh

This will fetch the latest list of models from Anyscale Endpoints and store it in a local cache file.

Development

To set up this plugin locally, first check out the code.
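If you do not already have a checkout, cloning the plugin's GitHub repository (assumed here to live at github.com/simonw/llm-anyscale-endpoints) looks like this:

git clone https://github.com/simonw/llm-anyscale-endpoints

Then create a new virtual environment inside the checkout: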

cd llm-anyscale-endpoints
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest