
Environment variables

Every variable the Bee Flow server reads, grouped by area. Defaults are shown where they exist; bold vars are required for that area to work.

The full list also lives in `.env.example` in the server repo.

Core / runtime

| Variable | Default | Purpose |
|---|---|---|
| `PUBLIC_URL` | `http://localhost:3101` | The public URL users hit. Used for OAuth redirects and email links. |
| **`JWT_SECRET`** | (none) | 64+ char random string. Signs session JWTs. |
| `PORT` | `3101` | HTTP port the server listens on. |
| `LOG_LEVEL` | `info` | `trace` / `debug` / `info` / `warn` / `error`. |
| `NODE_ENV` | `production` | Standard Node convention. |
| `TIMEZONE` | `UTC` | Default timezone for cron schedules and date display. |
| `BEEFLOW_DATA_DIR` | `/data` | Directory for any small on-disk state (rare; most state is in Postgres). |
| `BEEFLOW_REQUEST_BODY_LIMIT` | `10mb` | Max JSON body. Raise for large KB document uploads. |
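One way to produce a suitable `JWT_SECRET` at install time is OpenSSL (any source of 64+ random characters works; this is just a convenient sketch):

```shell
# Generate a 64-character hex secret suitable for JWT_SECRET
openssl rand -hex 32
```

Paste the output into `.env` as `JWT_SECRET=<output>` and keep it stable across restarts, or existing sessions will be invalidated.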

Database

| Variable | Default | Purpose |
|---|---|---|
| `DB_HOST` | `postgres` | Postgres host. |
| `DB_PORT` | `5432` | Postgres port. |
| `DB_NAME` | `beeflow` | Database name. |
| `DB_USER` | `beeflow` | Database user. |
| **`DB_PASSWORD`** | (none) | Database password. |
| `DB_SSL` | `false` | `true` to require TLS. Use for managed Postgres. |
| `DB_POOL_MAX` | `10` | Max connections from each server replica. |
| `DB_POOL_IDLE_TIMEOUT_MS` | `30000` | Close idle connections after this many ms. |
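A minimal `.env` database block for a managed Postgres instance might look like this (host, password, and pool size are placeholder values — adjust to your environment):

```shell
DB_HOST=db.example.internal
DB_PORT=5432
DB_NAME=beeflow
DB_USER=beeflow
DB_PASSWORD=change-me
DB_SSL=true        # managed Postgres usually requires TLS
DB_POOL_MAX=10     # per replica; total connections = replicas x DB_POOL_MAX
```

Note the pool math in the comment: three server replicas with `DB_POOL_MAX=10` can open up to 30 connections, so size your Postgres `max_connections` accordingly.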

Redis

| Variable | Default | Purpose |
|---|---|---|
| `REDIS_URL` | (none) | E.g. `redis://redis:6379`. Required for >1 server replica. |
| `REDIS_TLS` | `false` | Set to `true` for `rediss://` connections. |
| `REDIS_KEY_PREFIX` | `bf:` | Prefix to namespace keys. Useful when sharing a Redis instance. |
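For a multi-replica deployment sharing one Redis instance, a sketch might be (the `redis` hostname and `prod` suffix are placeholder assumptions for a compose-style setup):

```shell
REDIS_URL=redis://redis:6379
REDIS_KEY_PREFIX=bf:prod:   # namespace keys so other apps on this Redis don't collide
```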

Model providers

Set at least one. Agents pick whichever is configured for their model.

| Variable | Provider | Notes |
|---|---|---|
| `ANTHROPIC_API_KEY` | Anthropic Claude | Recommended default. |
| `ANTHROPIC_BASE_URL` | (override) | For a self-hosted Claude proxy. |
| `OPENAI_API_KEY` | OpenAI | GPT-4.x, GPT-5. |
| `OPENAI_BASE_URL` | (override) | OpenAI-compatible endpoints (LiteLLM, vLLM, etc.). |
| `MISTRAL_API_KEY` | Mistral | Hosted Mistral. |
| `AZURE_OPENAI_ENDPOINT` | Azure OpenAI | E.g. `https://my-aoai.openai.azure.com`. |
| `AZURE_OPENAI_KEY` | Azure OpenAI | API key. |
| `AZURE_OPENAI_API_VERSION` | `2024-08-01-preview` | API version string. |
| `AZURE_OPENAI_DEPLOYMENT_GPT4` | (deployment name) | Maps the GPT-4 model alias to your deployment. |
| `AZURE_OPENAI_DEPLOYMENT_EMBED` | (deployment name) | For Azure-hosted embeddings. |
| `OLLAMA_BASE_URL` | (none) | E.g. `http://localhost:11434`. Self-hosted local models. |
| `VOXTRAL_API_KEY` | Voxtral | Voice (STT + TTS). |
| `ELEVENLABS_API_KEY` | ElevenLabs | TTS, music, sound effects. |
| `DEEPGRAM_API_KEY` | Deepgram | Alternative STT provider. |
| `MISTRAL_TRANSCRIPTION_KEY` | Mistral | Transcription pipeline. |
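As an illustration, a deployment using Anthropic for chat and Azure OpenAI for embeddings might set the following (keys, endpoint, and the deployment name are placeholders):

```shell
ANTHROPIC_API_KEY=sk-ant-...                 # placeholder
AZURE_OPENAI_ENDPOINT=https://my-aoai.openai.azure.com
AZURE_OPENAI_KEY=...                         # placeholder
AZURE_OPENAI_API_VERSION=2024-08-01-preview
AZURE_OPENAI_DEPLOYMENT_EMBED=my-embed-deploy  # your Azure deployment name
```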

The web-search provider is picked in Admin → Integrations → Zoeken. Its API keys live encrypted in the admin DB, not in `.env` — set them in the admin UI:

| Provider | Where to set |
|---|---|
| Self-hosted Agent Search + Serper | Admin → Integrations → Zoeken (Serper key) + the `SEARCH_SERVICE_URL` env var below |
| Cloud-only (Serper + provider APIs) | Admin → Integrations → Zoeken (Serper key only) |
| Azure Bing Web Search | Admin → Integrations → Zoeken (Bing key + market) |

See Web search integration for the full setup.

Privacy & DLP

| Variable | Default | Purpose |
|---|---|---|
| `AZURE_PII_ENDPOINT` | (none) | Azure AI Text Analytics endpoint. Falls back to the in-process Transformers.js detector, then to the optional `PII_SERVICE_URL` if set. |
| `AZURE_PII_KEY` | (none) | Azure AI Text Analytics key. |
| `PII_SERVICE_URL` | (none — opt-in) | Optional Python sidecar running GLiNER multi PII v1 (Apache 2.0). Only probed when explicitly set; otherwise the in-process Transformers.js detector handles the CPU path. |
| `GUARD_SERVICE_URL` | (none — opt-in) | Optional sidecar health probe for `/api/guard/health`. Returns not-configured when unset. Moderation itself now goes natively via Azure Content Safety. |
| `AZURE_CONTENT_SAFETY_ENDPOINT` | (none) | Azure Content Safety for moderation. The default moderation provider when configured. |
| `AZURE_CONTENT_SAFETY_KEY` | (none) | Azure Content Safety key. |

Search service (KB)

The KB pipeline can run entirely in-process (chunking + pgvector + RRF + cross-encoder rerank) — no GPU service required. Pick Local under Admin → AI Configuratie → Limits & Self-host. The remote search-service is opt-in via SEARCH_SERVICE_URL.

| Variable | Default | Purpose |
|---|---|---|
| `SEARCH_SERVICE_URL` | (none — opt-in) | External GPU-backed search-service endpoint. When set, KB ingestion / search routes to it for users who pick `kb_provider = remote` in admin. Leave empty for fully local KB. |
| `RERANKER_URL` | (none — opt-in) | vLLM cross-encoder sidecar. Used as a tier between Azure Cohere and the in-process CPU reranker. Most self-hosted deployments don't need it. |
| `EMBED_API_URL` | (none — legacy) | Self-hosted GPU embedding service (BGE-M3). Memory store falls back to it only when no provider and no CPU embedder are available. |

Embeddings

Global embedding model is picked in Admin → AI Configuratie → Embeddings. Used for KB ingestion, KB query, memory store, and as the default embed for web-search inference. The Web Search Inference tab can override per-feature.

The CPU fallback (@huggingface/transformers + Xenova/multilingual-e5-small, MIT, 384-dim) loads automatically when no provider is configured. No env var to set.

Reranking

| Method | How |
|---|---|
| Azure Cohere reranker | Admin → AI Configuratie → API Sleutels → Azure Cohere Reranker |
| CPU cross-encoder | Admin → AI Configuratie → Limits & Self-host → toggle `cpu_reranker_enabled`. Uses Xenova/bge-reranker-base (MIT, ~280 MB), loaded in-process |
| Local GPU vLLM sidecar | Set the `RERANKER_URL` env var. Skipped when the CPU reranker is enabled. |
| LLM-as-rerank | Admin → AI Configuratie → Web Search Inference → method = "Provider model" + pick a chat model |

OAuth providers

For each provider you enable, register an OAuth app with redirect URI `https://<your-host>/auth/<provider>/callback`.

| Variable | Provider | Purpose |
|---|---|---|
| `OAUTH_GOOGLE_CLIENT_ID` | Google | Gmail / Calendar / Drive / Docs / Keep / Contacts / Groups |
| `OAUTH_GOOGLE_CLIENT_SECRET` | Google | |
| `OAUTH_MICROSOFT_CLIENT_ID` | Microsoft | Outlook / MS Calendar / Contacts / OneDrive |
| `OAUTH_MICROSOFT_CLIENT_SECRET` | Microsoft | |
| `OAUTH_MICROSOFT_TENANT` | Microsoft | `common` (multi-tenant) or your tenant ID |
| `OAUTH_GITHUB_CLIENT_ID` | GitHub | GitHub tools |
| `OAUTH_GITHUB_CLIENT_SECRET` | GitHub | |
| `OAUTH_NEXTCLOUD_CLIENT_ID` | Nextcloud | Standalone NC OAuth (when not using the connector) |
| `OAUTH_NEXTCLOUD_CLIENT_SECRET` | Nextcloud | |
| `OAUTH_NEXTCLOUD_BASE_URL` | Nextcloud | E.g. `https://nc.example.com` |
| `OAUTH_LINKEDIN_CLIENT_ID` | LinkedIn | LinkedIn read-only |
| `OAUTH_LINKEDIN_CLIENT_SECRET` | LinkedIn | |
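Enabling Google, for example, means registering the redirect URI in the Google Cloud console and setting both vars (the values below are placeholders):

```shell
# Redirect URI to register with Google:
#   https://beeflow.example.com/auth/google/callback
OAUTH_GOOGLE_CLIENT_ID=1234567890-abc.apps.googleusercontent.com
OAUTH_GOOGLE_CLIENT_SECRET=GOCSPX-...   # placeholder
```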

API-key integrations

| Variable | Integration |
|---|---|
| `YOUTRACK_BASE_URL` | YouTrack — instance URL |
| `YOUTRACK_API_TOKEN` | YouTrack |
| `N8N_BASE_URL` | n8n — instance URL |
| `N8N_API_KEY` | n8n |
| `SIGNREQUEST_API_KEY` | SignRequest |
| `FIREFLIES_API_KEY` | Fireflies |
| `GAMMA_API_KEY` | Gamma |

License

| Variable | Default | Purpose |
|---|---|---|
| `BEEFLOW_LICENSE_KEY` | (empty) | JWT license. Empty = Community tier. |
| `BEEFLOW_LICENSE_PUBLIC_KEY_PATH` | `license/bundled-public-key.pem` | Override only for testing. |
| `BEEFLOW_LICENSE_REFRESH_URL` | `https://license.beeflow.ai/refresh` | Where Pro+ keys check for revocation. |
| `BEEFLOW_LICENSE_REFRESH_INTERVAL_HOURS` | `24` | How often to re-check. |

Nextcloud connector pairing

| Variable | Default | Purpose |
|---|---|---|
| `NC_CONNECTOR_HMAC_SECRET` | (auto-generated per tenant) | Shared secret for the `/nc/*` HMAC reverse proxy. Stored in the DB; you usually don't set this manually. |
| `NC_CONNECTOR_BOOTSTRAP_TOKEN_TTL` | `300` | Seconds the bootstrap handshake token is valid. |

Observability

| Variable | Default | Purpose |
|---|---|---|
| `LOG_LEVEL` | `info` | See above. |
| `LOG_FORMAT` | `json` | `json` for SIEM, `pretty` for local dev. |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | (none) | OpenTelemetry endpoint for traces / metrics. |
| `OTEL_SERVICE_NAME` | `beeflow-server` | OTel service name. |
| `METRICS_ENABLED` | `true` | Exposes `/metrics` (Prometheus format). |
| `METRICS_BASIC_AUTH` | (none) | `user:pass` to protect `/metrics`. |
| `SENTRY_DSN` | (none) | Sentry error reporting. |
| `AUDIT_LOG_RETENTION_DAYS` | `90` | How long guardrail / audit events stay in Postgres. |

Feature flags

| Variable | Default | Purpose |
|---|---|---|
| `FEATURE_NOTEBOOKS_ENABLED` | `true` | Notebooks page (per-user research workspace). |
| `FEATURE_VOICE_ENABLED` | `true` | Voice features (require Pro+ at runtime). |
| `FEATURE_AGENT_MARKETPLACE` | `true` | Public agent marketplace tab in Studio. |
| `FEATURE_KB_MARKETPLACE` | `true` | Public KB marketplace tab. |
| `FEATURE_SKILLS_ENABLED` | `true` | Skills system (Pro+ at runtime). |
| `FEATURE_AUTOMATIONS_ENABLED` | `true` | Automations (Pro+ at runtime). |

All flags default to `true`. Set one to `false` to hide the corresponding UI even for tiers that would otherwise allow it — useful for test installs or compliance lockdowns.
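A compliance-lockdown install might, for instance, pin the marketplace and automation surfaces off regardless of tier:

```shell
FEATURE_AGENT_MARKETPLACE=false
FEATURE_KB_MARKETPLACE=false
FEATURE_AUTOMATIONS_ENABLED=false
```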

Email (outbound)

| Variable | Default | Purpose |
|---|---|---|
| `SMTP_HOST` | (none) | SMTP server for invitations, password resets, notifications. |
| `SMTP_PORT` | `587` | |
| `SMTP_USER` | (none) | |
| `SMTP_PASS` | (none) | |
| `SMTP_FROM` | `noreply@beeflow.ai` | From address. |
| `SMTP_TLS` | `true` | STARTTLS. |
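A typical STARTTLS setup against a generic relay looks like this (hostname, credentials, and from-address are placeholders):

```shell
SMTP_HOST=smtp.example.com
SMTP_PORT=587
SMTP_USER=beeflow-mailer
SMTP_PASS=change-me
SMTP_FROM=noreply@example.com
SMTP_TLS=true   # STARTTLS on port 587
```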

Encryption-at-rest

| Variable | Default | Purpose |
|---|---|---|
| **`BEEFLOW_ENCRYPTION_KEY`** | (none) | 32-byte key for encrypting OAuth tokens / API keys at rest in Postgres. |
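One way to produce 32 random bytes is OpenSSL (the encoding the server expects isn't stated here — check `.env.example`; this sketch assumes base64):

```shell
# 32 random bytes, base64-encoded (44 characters of output)
openssl rand -base64 32
```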

Bee Flow service URL (used by the Nextcloud connector)

| Variable | Default | Purpose |
|---|---|---|
| `BEEFLOW_API_BASE_URL` | `https://api.beeflow.ai` | Set on the connector side via `occ app_api:app:setenv`. Override only for staging / on-prem. |
| `BEEFLOW_TENANT_KEY` | `auto` | Connector-side. The literal value `auto` means provision automatically. |