# LLM configuration
Configure LLM providers in the application config, not per-pipeline. Pipelines reference provider profiles by name and specify the model per adapter.
## Basic shape

In `backend/config/app.yaml` (or the equivalent for your deployment):
```yaml
llm:
  providers:
    - name: "default"
      provider_type: "openai"
      model: "gpt-4o"
      api_key: "${OPENAI_API_KEY}"

    - name: "cheap"
      provider_type: "anthropic"
      model: "claude-haiku-4-5-20251001"
      api_key: "${ANTHROPIC_API_KEY}"

    - name: "embed"
      provider_type: "openai"
      model: "text-embedding-3-small"
      api_key: "${OPENAI_API_KEY}"
```

Environment variables are substituted at load time.
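The load-time substitution can be pictured as a simple pass over string values. This is an illustrative sketch only — the `interpolate` helper and its exact failure behavior are assumptions, not the loader's actual API:

```python
import os
import re

# Matches ${VAR} placeholders like the ones in app.yaml above.
_VAR = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def interpolate(value: str) -> str:
    """Replace each ${VAR} with the matching environment variable."""
    def replace(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name!r} is not set")
        return os.environ[name]
    return _VAR.sub(replace, value)

os.environ["OPENAI_API_KEY"] = "sk-demo"
print(interpolate("api_key: ${OPENAI_API_KEY}"))  # api_key: sk-demo
```

Failing loudly on an unset variable (rather than substituting an empty string) surfaces misconfiguration at startup instead of at the first LLM call.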
## Provider types

| `provider_type` | What it constructs | Required fields |
|---|---|---|
| `openai` | `OpenAIClient` | `api_key` |
| `azure` | `AzureOpenAIClient` | `api_key`, `azure_endpoint`, `api_version` |
| `anthropic` | `AnthropicClient` | `api_key` |
| `bedrock` | `BedrockEmbeddingClient` (embedding only) | AWS creds via the boto3 default chain |
| `huggingface` | `HuggingFaceEmbeddingClient` (embedding only, local) | Model name only; no remote creds |
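The required-fields column above can be expressed as a validation table. A hypothetical sketch — `REQUIRED_FIELDS` and `validate_profile` are illustrative names, and the real factory's checks may differ:

```python
# Field requirements per provider type, mirroring the table above.
REQUIRED_FIELDS = {
    "openai": ("api_key",),
    "azure": ("api_key", "azure_endpoint", "api_version"),
    "anthropic": ("api_key",),
    "bedrock": (),       # AWS creds resolved via the boto3 default chain
    "huggingface": (),   # local model; no remote creds
}

def validate_profile(profile: dict) -> None:
    ptype = profile.get("provider_type")
    if ptype not in REQUIRED_FIELDS:
        raise ValueError(f"unknown provider_type: {ptype!r}")
    missing = [f for f in REQUIRED_FIELDS[ptype] if not profile.get(f)]
    if missing:
        raise ValueError(f"{ptype} profile is missing: {', '.join(missing)}")

validate_profile({"provider_type": "openai", "api_key": "sk-demo"})  # passes
```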
## Pipeline references

An adapter references a provider by profile name, not by provider type:

```yaml
- type: "llm_translator"
  config:
    provider: "cheap"
    model: "claude-haiku-4-5-20251001"  # override the profile's default, if needed
    max_tokens: 4096
    system_prompt: "..."
```

Advantage: swap `"cheap"` from Anthropic to Azure OpenAI by editing the app config — no pipeline change needed.
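The override semantics can be sketched as: the profile supplies defaults, and any `model` set on the adapter wins. `PROFILES` and `resolve` here are hypothetical names for illustration, not the actual resolution code:

```python
# Profiles as loaded from app config (only the relevant fields shown).
PROFILES = {
    "cheap": {"provider_type": "anthropic", "model": "claude-haiku-4-5-20251001"},
}

def resolve(adapter_cfg: dict) -> dict:
    settings = dict(PROFILES[adapter_cfg["provider"]])  # copy the profile defaults
    if "model" in adapter_cfg:
        settings["model"] = adapter_cfg["model"]        # adapter-level override wins
    return settings

print(resolve({"provider": "cheap"})["model"])  # claude-haiku-4-5-20251001
```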
## Verifying credentials

After startup:

```sh
factflow system health --debug
```

The `llm` component lists each provider profile with its state. A failed provider (bad credentials, missing optional dependency) shows `status: degraded` along with the error.

Quick manual test:

```sh
factflow system health   # terse — pass/fail only
```

## Embedding vs. completion
Two protocols, two factory methods:

- `factory.create_completion_client(provider_name)` — returns an `LLMClientProtocol` (chat/completion)
- `factory.create_embedding_client(provider_name)` — returns an `EmbeddingClientProtocol`

Some providers support only one. The factory raises at construction time if you ask for a client type the provider doesn't support — fail fast.
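A minimal sketch of that fail-fast check. Which providers land in the completion-capable set is an assumption here (the table above only marks `bedrock` and `huggingface` as embedding-only), and `create_completion_client` below is a standalone illustration, not the factory's real signature:

```python
# Assumed capability set for this sketch.
COMPLETION_CAPABLE = {"openai", "azure", "anthropic"}

def create_completion_client(provider_type: str):
    # Raise at construction time, not on first use.
    if provider_type not in COMPLETION_CAPABLE:
        raise ValueError(
            f"provider type {provider_type!r} does not support completions"
        )
    # ... construct and return the concrete client here
```

Raising here, rather than returning a client that fails on its first call, keeps the error close to the misconfigured pipeline definition.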
## Credentials via env

Recommended pattern: set credentials in the environment and reference them in YAML with `${VAR}` interpolation.

```sh
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
cd backend && uv run python -m factflow_server serve --embedded
```

Never commit credentials to YAML.
## Optional dependencies

The Anthropic, Bedrock, and HuggingFace providers are gated behind optional imports. If the library isn't installed, the factory skips the provider and logs the skip at INFO level. `ANTHROPIC_AVAILABLE`, `BEDROCK_AVAILABLE`, and `SENTENCE_TRANSFORMERS_AVAILABLE` are the runtime flags.
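The gating pattern described above is the standard try/except import with a module-level flag; sketched here with the Anthropic flag. `build_anthropic_provider` is a hypothetical name for illustration:

```python
import logging

logger = logging.getLogger(__name__)

try:
    import anthropic  # optional dependency
    ANTHROPIC_AVAILABLE = True
except ImportError:
    ANTHROPIC_AVAILABLE = False

def build_anthropic_provider(profile: dict):
    if not ANTHROPIC_AVAILABLE:
        # Skip the provider; the factory logs this at INFO level.
        logger.info("anthropic not installed; skipping provider %r", profile["name"])
        return None
    # ... construct and return an AnthropicClient here
```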
To install all LLM deps:

```sh
cd backend && uv sync --extra all-llm
```

## Related
Section titled “Related”- factflow-llm reference — every public export, error taxonomy
- Concept: LLM clients — why this abstraction exists
- Rate limiting — how LLM calls are governed