## Supported LLM providers

| Provider | LLM_PROVIDER value |
|---|---|
| OpenAI | openai |
| Anthropic (Claude) | anthropic |
| Google (Gemini) | google |
| Amazon Bedrock | amazon-bedrock |
| Azure OpenAI | azure-openai |
## Environment variables

| Variable | Description | Default |
|---|---|---|
| LLM_PROVIDER | LLM provider to use | Required |
| LLM_MODEL | Model name (e.g., gpt-4o, claude-sonnet-4-20250514) | Required |
| LLM_API_KEY | API key for the provider (not required for amazon-bedrock) | Required |
| LLM_TEMPERATURE | Sampling temperature | 0.4 |
| LLM_CONTEXT_WINDOW | Context window size in tokens | Provider default |
| LLM_MAX_OUTPUT_TOKENS | Maximum tokens per response | 32000 |
| LLM_THINKING_LEVEL | Extended thinking level: high, medium, low | medium |
| LLM_AZURE_RESOURCE_NAME | Azure OpenAI resource name (required for azure-openai) | — |
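
Putting the variables together, a minimal shell configuration for the Anthropic provider might look like the sketch below. The model name is the example from the table above, and the API key value is a placeholder you would replace with your own:

```shell
# Required settings (API key shown is a placeholder, not a real key)
export LLM_PROVIDER=anthropic
export LLM_MODEL=claude-sonnet-4-20250514
export LLM_API_KEY=your-api-key-here

# Optional settings shown with their documented defaults
export LLM_TEMPERATURE=0.4
export LLM_MAX_OUTPUT_TOKENS=32000
export LLM_THINKING_LEVEL=medium
```

For amazon-bedrock, LLM_API_KEY can be omitted; for azure-openai, also set LLM_AZURE_RESOURCE_NAME to your Azure OpenAI resource name.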