API access to Microsoft's Azure OpenAI models
Install this plugin in the same environment as LLM:

```bash
llm install llm-azure-openai
```
Set the following environment variables so the plugin can authenticate with Azure AD:

```bash
export AZURE_TENANT_ID='...'
export AZURE_CLIENT_ID='...'
export AZURE_CLIENT_SECRET='...'
```
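Before running the plugin, it can help to confirm that all three variables are actually set in the current environment. The sketch below does that with only the standard library; the `missing_azure_vars` helper is illustrative and not part of the plugin:

```python
import os

# The three Azure AD variables this plugin expects (from the export
# commands above).
REQUIRED_VARS = ("AZURE_TENANT_ID", "AZURE_CLIENT_ID", "AZURE_CLIENT_SECRET")

def missing_azure_vars(environ=os.environ):
    """Return the names of any required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]

if __name__ == "__main__":
    missing = missing_azure_vars()
    if missing:
        print("Missing:", ", ".join(missing))
    else:
        print("All Azure AD credentials are set.")
```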
Configure the models you want to use from your Azure deployments in the `azure-openai-models.yaml` file:

- On macOS: `~/Library/Application Support/io.datasette.llm/azure-openai-models.yaml`
- On Linux: `~/.config/io.datasette.llm/azure-openai-models.yaml`
```yaml
- model_id: o3-mini
  model_name: o3-mini
  azure_endpoint: https://example.openai.azure.com
  api_version: '2024-12-01-preview'
  aliases: ['azure-o3-mini']
  use_azure_ad: true
- model_id: gpt-35-turbo-blue
  model_name: gpt-35-turbo-blue
  azure_endpoint: https://example.openai.azure.com
  api_version: '2024-02-01'
  aliases: ['azure-gpt-35']
  use_azure_ad: true
```
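Every entry in the YAML file carries the same handful of keys. As a rough sketch of what one parsed entry looks like once loaded (the `AzureModelConfig` class is hypothetical, shown for illustration only; the plugin defines its own schema internally):

```python
from dataclasses import dataclass, field

@dataclass
class AzureModelConfig:
    """Illustrative mirror of one azure-openai-models.yaml entry."""
    model_id: str
    model_name: str
    azure_endpoint: str
    api_version: str
    aliases: list = field(default_factory=list)
    use_azure_ad: bool = False

    def validate(self):
        # Azure OpenAI endpoints are always served over HTTPS.
        if not self.azure_endpoint.startswith("https://"):
            raise ValueError("azure_endpoint must be an https:// URL")
        return self

# The first entry from the example configuration above.
config = AzureModelConfig(
    model_id="o3-mini",
    model_name="o3-mini",
    azure_endpoint="https://example.openai.azure.com",
    api_version="2024-12-01-preview",
    aliases=["azure-o3-mini"],
    use_azure_ad=True,
).validate()
```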
Now run a model using `-m your-model-id` (or one of its aliases), for example:

```bash
llm -m azure-o3-mini "..."
```
To set up this plugin locally, first check out the code, then create a new virtual environment:

```bash
cd llm-azure-openai
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:

```bash
llm install -e '.[test]'
```
To run the tests:

```bash
pytest
```
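The test suite follows standard pytest conventions. As a toy illustration of the style, here is a self-contained test of alias resolution; the `ALIASES` dict and `resolve_alias` helper are invented for this sketch and are not taken from the plugin's actual code:

```python
# Stand-in for the alias-to-model mapping implied by the aliases
# declared in azure-openai-models.yaml above.
ALIASES = {
    "azure-o3-mini": "o3-mini",
    "azure-gpt-35": "gpt-35-turbo-blue",
}

def resolve_alias(name):
    """Map an alias to its model_id; unknown names pass through unchanged."""
    return ALIASES.get(name, name)

def test_alias_resolution():
    assert resolve_alias("azure-o3-mini") == "o3-mini"
    assert resolve_alias("azure-gpt-35") == "gpt-35-turbo-blue"
    # A model_id with no alias resolves to itself.
    assert resolve_alias("o3-mini") == "o3-mini"
```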
This package is signed using sigstore to provide supply chain security. When you install this package from PyPI, you can verify its authenticity by checking the digital signatures.