llm-watsonx

LLM plugin for IBM watsonx models

Installation

Install this plugin in the same environment as LLM:

llm install llm-watsonx
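
To confirm that LLM picked the plugin up, you can list the installed plugins (a general LLM command, not specific to this plugin):

llm plugins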

Configuration

You will need to provide the following environment variables:

export WATSONX_API_KEY=
export WATSONX_PROJECT_ID=

Optionally, if your watsonx instance is not in us-south:

export WATSONX_URL=
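
For example, an instance in the Frankfurt (eu-de) region would typically use the regional watsonx.ai endpoint (adjust the region to match your instance):

export WATSONX_URL=https://eu-de.ml.cloud.ibm.com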

Usage

Get a list of available commands:

llm watsonx --help

Models

See all available models:

llm watsonx list-models

See all generation options:

llm watsonx list-model-options
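
The listings are plain text, so standard shell filters work on them; for example, to narrow the model list down to Llama variants (assuming model IDs like the one used below):

llm watsonx list-models | grep llama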

Example

llm -m watsonx/meta-llama/llama-3-8b-instruct \
    -o temperature .4 \
    -o max_new_tokens 250 \
    "What is IBM watsonx?"

Chat Example

llm chat -m watsonx/meta-llama/llama-3-8b-instruct \
    -o max_new_tokens 1000 \
    -s "You are an assistant for a CLI (command line interface). Provide and help give unix commands to help users achieve their tasks."

Embeddings

See all available embedding models:

llm watsonx list-embedding-models

Example

cat README.md | llm embed -m watsonx/ibm/slate-30m-english-rtrvr
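
Because these are registered as regular LLM embedding models, LLM's collection tooling works with them as well. For example, to embed every Markdown file under a docs/ directory into a collection and then query it (the collection name, paths, and database file here are all illustrative):

llm embed-multi watsonx-docs \
    -m watsonx/ibm/slate-30m-english-rtrvr \
    --files docs '**/*.md' \
    -d embeddings.db --store

llm similar watsonx-docs -d embeddings.db -c "How do I set my watsonx API key?"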