GoRules AI requires an LLM provider to be configured on your BRMS instance. Your administrator sets this up via environment variables on the server.

Supported LLM providers

| Provider | `LLM_PROVIDER` value |
| --- | --- |
| OpenAI | `openai` |
| Anthropic (Claude) | `anthropic` |
| Google (Gemini) | `google` |
| Amazon Bedrock | `amazon-bedrock` |
| Azure OpenAI | `azure-openai` |

Environment variables

| Variable | Description | Default |
| --- | --- | --- |
| `LLM_PROVIDER` | LLM provider to use | Required |
| `LLM_MODEL` | Model name (e.g., `gpt-4o`, `claude-sonnet-4-20250514`) | Required |
| `LLM_API_KEY` | API key for the provider (not required for `amazon-bedrock`) | Required |
| `LLM_TEMPERATURE` | Sampling temperature | `0.4` |
| `LLM_CONTEXT_WINDOW` | Context window size in tokens | Provider default |
| `LLM_MAX_OUTPUT_TOKENS` | Maximum tokens per response | `32000` |
| `LLM_THINKING_LEVEL` | Extended thinking level: `high`, `medium`, or `low` | `medium` |
| `LLM_AZURE_RESOURCE_NAME` | Azure OpenAI resource name (required for `azure-openai`) | |
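As an illustration, a minimal configuration using Anthropic as the provider might look like the sketch below. The values are placeholders, and how you set these variables (a `.env` file, container environment, or systemd unit) depends on how your BRMS instance is deployed:

```shell
# Hypothetical environment configuration for GoRules AI (placeholder values).
LLM_PROVIDER=anthropic
LLM_MODEL=claude-sonnet-4-20250514
LLM_API_KEY=your-api-key        # omit for amazon-bedrock
LLM_TEMPERATURE=0.4             # optional; see defaults in the table above
LLM_MAX_OUTPUT_TOKENS=32000     # optional
```

Only `LLM_PROVIDER`, `LLM_MODEL`, and (for most providers) `LLM_API_KEY` are required; the remaining variables fall back to the defaults listed above.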

Next steps

Once configured, the AI assistant is available to all users on a plan with AI enabled. See AI assistant for usage details.