LLM and Authentication Variables
The backend relies on these secrets to talk to model providers, orchestrate researcher/report agents, and enable OAuth flows.
LLM_CONFIGS
- Decide which providers you want to use (OpenAI-compatible, Anthropic, Gemini, etc.).
- For each provider, collect its API key, plus the base URL if the provider requires a custom endpoint.
- Build a JSON array describing each model, e.g.:

```json
[
  {
    "provider": "openai",
    "model": "gpt-4o-mini",
    "apiKey": "sk-your-key",
    "baseUrl": "https://api.openai.com/v1",
    "maxRetries": 3
  }
]
```

- Paste the serialized JSON blob into `LLM_CONFIGS`, wrapping the value in single quotes inside `.stack.env` (e.g. `LLM_CONFIGS='[{"provider": "openai", ...}]'`) so special characters survive.
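Because a malformed `LLM_CONFIGS` value usually surfaces only at request time, it can help to sanity-check the JSON before the backend starts. The sketch below is a hypothetical validator, not part of the backend itself; the field names (`provider`, `model`, `apiKey`) simply mirror the example above, so adjust the required set to your deployment.

```python
import json
import os

# Fields every entry in the example above carries; adjust as needed.
REQUIRED_FIELDS = {"provider", "model", "apiKey"}


def load_llm_configs(raw: str) -> list[dict]:
    """Parse the serialized LLM_CONFIGS value and check each entry."""
    configs = json.loads(raw)
    if not isinstance(configs, list):
        raise ValueError("LLM_CONFIGS must be a JSON array")
    for i, cfg in enumerate(configs):
        missing = REQUIRED_FIELDS - cfg.keys()
        if missing:
            raise ValueError(f"config #{i} is missing {sorted(missing)}")
    return configs


if __name__ == "__main__":
    # Read the same variable the backend reads; default to an empty array.
    raw = os.environ.get("LLM_CONFIGS", "[]")
    for cfg in load_llm_configs(raw):
        print(cfg["provider"], cfg["model"])
```

Running this once against the exact string you put in `.stack.env` catches quoting and syntax mistakes before they reach the agents.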