Documentation Index

Fetch the complete documentation index at: https://opensre.com/docs/llms.txt

Use this file to discover all available pages before exploring further.

OpenSRE’s official deployment path is LangGraph Platform.

Deploy on LangGraph Platform (official)

1. Create a LangGraph deployment

In LangGraph Platform, create a new deployment and connect this repository.
2. Use the repo's graph config

Keep langgraph.json at the repository root so LangGraph can load OpenSRE’s graph and HTTP entrypoints.
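For orientation, a langgraph.json typically declares the dependencies, graph entrypoints, and env file for the platform to load. The graph path and name below are illustrative placeholders, not OpenSRE's actual values — use the file already in the repository root:

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/opensre/graph.py:graph"
  },
  "env": ".env"
}
```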
3. Configure your model provider

Add LLM_PROVIDER as an environment variable (for example anthropic, openai, openrouter, or gemini).
4. Add the matching provider API key

Use the API key that matches your provider:
  • ANTHROPIC_API_KEY for LLM_PROVIDER=anthropic
  • OPENAI_API_KEY for LLM_PROVIDER=openai
  • OPENROUTER_API_KEY for LLM_PROVIDER=openrouter
  • GEMINI_API_KEY for LLM_PROVIDER=gemini
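The provider-to-key mapping above can be sanity-checked before deploying. This is a small sketch, not part of OpenSRE's tooling — the function name is hypothetical:

```shell
# Sketch: fail fast if the API key matching LLM_PROVIDER is missing.
# Run in the environment you are about to deploy from.
check_llm_env() {
  case "$LLM_PROVIDER" in
    anthropic)  [ -n "$ANTHROPIC_API_KEY" ]  || { echo "missing ANTHROPIC_API_KEY" >&2; return 1; } ;;
    openai)     [ -n "$OPENAI_API_KEY" ]     || { echo "missing OPENAI_API_KEY" >&2; return 1; } ;;
    openrouter) [ -n "$OPENROUTER_API_KEY" ] || { echo "missing OPENROUTER_API_KEY" >&2; return 1; } ;;
    gemini)     [ -n "$GEMINI_API_KEY" ]     || { echo "missing GEMINI_API_KEY" >&2; return 1; } ;;
    *) echo "unknown LLM_PROVIDER: $LLM_PROVIDER" >&2; return 1 ;;
  esac
}
```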
5. Add integration env vars and deploy

Add any additional environment variables required by your integrations, then deploy and verify health in LangGraph Platform.
Minimum LLM environment example:

```
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=...
```

The complete list of provider keys and optional model overrides is documented in .env.example.

Railway deployment (self-hosted alternative)

Railway is still supported if you prefer self-hosting. Before running opensre deploy railway, make sure your Railway service has:
  • DATABASE_URI pointing to your Railway Postgres instance
  • REDIS_URI pointing to your Railway Redis instance
Then deploy:

```
opensre deploy railway --project <project> --service <service> --yes
```
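If those variables are not set on the service yet, they can be added from the Railway CLI first. This is an illustrative sketch — the connection strings are placeholders, and the `--set` flag syntax may differ across Railway CLI versions:

```shell
# Illustrative: set the required connection strings on the linked service,
# then run the OpenSRE deploy command against it.
railway variables --set "DATABASE_URI=postgresql://..." --set "REDIS_URI=redis://..."
opensre deploy railway --project <project> --service <service> --yes
```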