llm-openrouter

LLM plugin for models hosted by OpenRouter

Installation

First, install the LLM command-line utility.

Now install this plugin in the same environment as LLM.

llm install llm-openrouter
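
After installing, you can check that LLM has picked up the plugin by listing installed plugins; llm-openrouter should appear in the output:

llm plugins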

Configuration

You will need an API key from OpenRouter. You can obtain one from the OpenRouter website.

You can set that as an environment variable called OPENROUTER_KEY, or add it to the set of keys saved by LLM using:

llm keys set openrouter
Enter key: <paste key here>
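
If you prefer the environment variable approach mentioned above, export the key in your shell before running llm (the value shown is a placeholder):

export OPENROUTER_KEY='<paste key here>'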

Usage

To list available models, run:

llm models list

You should see a list that looks something like this:

OpenRouter: openrouter/openai/gpt-3.5-turbo
OpenRouter: openrouter/anthropic/claude-2
OpenRouter: openrouter/meta-llama/llama-2-70b-chat
...
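
Because the list includes models from every installed plugin, it can help to filter it down to just the OpenRouter entries with a standard shell pipe, for example:

llm models list | grep -i openrouter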

To run a prompt against a model, pass its full model ID to the -m option, like this:

llm -m openrouter/anthropic/claude-2 "Five spooky names for a pet tarantula"
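
LLM's -c/--continue option lets you follow up on your most recent prompt, and this should work with OpenRouter models as well, for example:

llm -c "Make them sound even more ominous"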

You can set a shorter alias for a model using the llm aliases command like so:

llm aliases set claude openrouter/anthropic/claude-2
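
To confirm the alias was recorded, you can list the currently defined aliases:

llm aliases list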

Now you can prompt Claude using:

cat llm_openrouter.py | llm -m claude -s 'write some pytest tests for this'

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-openrouter
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest