LLM configuration

Configure LLM providers in the application config, not per-pipeline. Pipelines reference provider profiles by name and specify the model per adapter.

In backend/config/app.yaml (or the equivalent for your deployment):

llm:
  providers:
    - name: "default"
      provider_type: "openai"
      model: "gpt-4o"
      api_key: "${OPENAI_API_KEY}"
    - name: "cheap"
      provider_type: "anthropic"
      model: "claude-haiku-4-5-20251001"
      api_key: "${ANTHROPIC_API_KEY}"
    - name: "embed"
      provider_type: "openai"
      model: "text-embedding-3-small"
      api_key: "${OPENAI_API_KEY}"

Environment variables are substituted at load time.
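As a rough illustration of `${VAR}`-style substitution at load time, a minimal sketch (the function name and the handling of unset variables here are assumptions, not the real loader):

```python
import os
import re

# Matches ${VAR} where VAR is a valid environment-variable name.
_VAR = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def substitute_env(value: str) -> str:
    """Replace each ${VAR} occurrence with that environment variable's value.

    Unset variables become empty strings in this sketch; the real loader
    may instead raise or leave the placeholder intact.
    """
    return _VAR.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["OPENAI_API_KEY"] = "sk-test"
print(substitute_env("${OPENAI_API_KEY}"))  # → sk-test
```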

| provider_type | What it constructs | Required fields |
| --- | --- | --- |
| `openai` | OpenAIClient | api_key |
| `azure` | AzureOpenAIClient | api_key, azure_endpoint, api_version |
| `anthropic` | AnthropicClient | api_key |
| `bedrock` | BedrockEmbeddingClient (embedding only) | AWS creds via boto3 default chain |
| `huggingface` | HuggingFaceEmbeddingClient (embedding only, local) | Model name only; no remote creds |

An adapter references a provider by profile name, not by provider type:

- type: "llm_translator"
  config:
    provider: "cheap"
    model: "claude-haiku-4-5-20251001" # override the profile's default, if needed
    max_tokens: 4096
    system_prompt: "..."

Advantage: swapping "cheap" from Anthropic to Azure OpenAI is a one-line app-config edit; no pipeline definitions change.
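The profile-plus-override lookup can be sketched like this (a toy resolver under assumed names; the real code path is inside the backend):

```python
# Provider profiles keyed by name, as loaded from app.yaml.
providers = {
    "default": {"provider_type": "openai", "model": "gpt-4o"},
    "cheap": {"provider_type": "anthropic", "model": "claude-haiku-4-5-20251001"},
}

def resolve(adapter_config: dict) -> dict:
    """Look up the named profile; an adapter-level "model" key overrides its default."""
    profile = dict(providers[adapter_config["provider"]])  # copy, don't mutate
    if "model" in adapter_config:
        profile["model"] = adapter_config["model"]
    return profile

print(resolve({"provider": "cheap"})["model"])  # → claude-haiku-4-5-20251001
```

Because adapters only name a profile, re-pointing "cheap" at a different provider_type changes every adapter that uses it at once.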

After startup:

factflow system health --debug

The llm component lists each provider profile with its state. A failed provider (bad credentials, missing optional dep) shows status: degraded with the error.

Quick manual test:

factflow system health # terse — pass/fail only

Two protocols, two factory methods:

  • factory.create_completion_client(provider_name) — returns LLMClientProtocol (chat/completion)
  • factory.create_embedding_client(provider_name) — returns EmbeddingClientProtocol

Some providers support only one. The factory raises at construction time if you ask for a type the provider doesn’t support — fail fast.
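A minimal sketch of the fail-fast check (the class name, method bodies, and capability sets here are illustrative assumptions, not the real factory):

```python
from typing import Protocol

class LLMClientProtocol(Protocol):
    def complete(self, prompt: str) -> str: ...

class EmbeddingClientProtocol(Protocol):
    def embed(self, text: str) -> list[float]: ...

# Assumed capability sets, derived from the provider table above.
COMPLETION_TYPES = {"openai", "azure", "anthropic"}
EMBEDDING_TYPES = {"openai", "azure", "bedrock", "huggingface"}

class LLMClientFactory:
    def __init__(self, profiles: dict[str, dict]):
        self.profiles = profiles

    def create_completion_client(self, provider_name: str):
        ptype = self.profiles[provider_name]["provider_type"]
        if ptype not in COMPLETION_TYPES:
            # Raise at construction time, not on first request.
            raise ValueError(f"provider {provider_name!r} ({ptype}) is embedding-only")
        ...  # construct and return the concrete completion client

    def create_embedding_client(self, provider_name: str):
        ptype = self.profiles[provider_name]["provider_type"]
        if ptype not in EMBEDDING_TYPES:
            raise ValueError(f"provider {provider_name!r} ({ptype}) has no embedding client")
        ...  # construct and return the concrete embedding client
```

Raising during construction means a misconfigured adapter fails at startup rather than mid-pipeline.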

Recommended pattern: set creds in env, reference them in YAML with ${VAR} interpolation.

export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
cd backend && uv run python -m factflow_server serve --embedded

Never commit credentials to YAML.

Anthropic, Bedrock, and HuggingFace providers are gated by optional imports. If the library isn't installed, the factory skips the provider and logs the skip at INFO level. ANTHROPIC_AVAILABLE, BEDROCK_AVAILABLE, and SENTENCE_TRANSFORMERS_AVAILABLE are the runtime flags.
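The gating pattern looks roughly like this (the flag name matches the docs; the surrounding function and module layout are assumptions):

```python
import logging

logger = logging.getLogger(__name__)

# Set the availability flag once at import time.
try:
    import anthropic  # noqa: F401
    ANTHROPIC_AVAILABLE = True
except ImportError:
    ANTHROPIC_AVAILABLE = False

def usable_profiles(profiles: list[dict]) -> list[dict]:
    """Drop profiles whose provider library is not importable, logging each skip."""
    usable = []
    for p in profiles:
        if p["provider_type"] == "anthropic" and not ANTHROPIC_AVAILABLE:
            logger.info("anthropic not installed; skipping profile %r", p["name"])
            continue
        usable.append(p)
    return usable
```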

To install all LLM deps:

cd backend && uv sync --extra all-llm