OpenSRE’s official deployment path is LangGraph Platform.

Documentation Index
Fetch the complete documentation index at: https://opensre.com/docs/llms.txt
Use this file to discover all available pages before exploring further.
Deploy on LangGraph Platform (official)
Create a LangGraph deployment
In LangGraph Platform, create a new deployment and connect this repository.
Use the repo's graph config
Keep `langgraph.json` at the repository root so LangGraph can load OpenSRE’s graph and HTTP entrypoints.

Configure your model provider
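As a sketch, a `langgraph.json` for a LangGraph Platform deployment typically looks like the following. The graph module path and graph name here are illustrative placeholders, not OpenSRE's actual entrypoints; the file in this repository defines the real ones.

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/agent/graph.py:graph"
  },
  "env": ".env"
}
```

The `graphs` map tells LangGraph Platform where to import each compiled graph from (`<module path>:<variable>`), and `dependencies` points at the package(s) to install.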
Add `LLM_PROVIDER` as an environment variable (for example `anthropic`, `openai`, `openrouter`, or `gemini`).

Add the matching provider API key
Use the API key that matches your provider:
- `ANTHROPIC_API_KEY` for `LLM_PROVIDER=anthropic`
- `OPENAI_API_KEY` for `LLM_PROVIDER=openai`
- `OPENROUTER_API_KEY` for `LLM_PROVIDER=openrouter`
- `GEMINI_API_KEY` for `LLM_PROVIDER=gemini`

See `.env.example` for reference.
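Put together, a matching environment configuration for the Anthropic provider might look like this (the key value is a placeholder, not a real credential):

```
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=<your-anthropic-api-key>
```

Only the key that matches your chosen `LLM_PROVIDER` needs to be set.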
Railway deployment (self-hosted alternative)
Railway is still supported if you prefer self-hosting. Before running `opensre deploy railway`, make sure your Railway service has:
- `DATABASE_URI` pointing to your Railway Postgres instance
- `REDIS_URI` pointing to your Railway Redis instance
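If your Postgres and Redis run as Railway services in the same project, one way to wire these variables is Railway's reference-variable syntax in the service's variable settings. The service names `Postgres` and `Redis` below are assumptions; adjust them to match your project:

```
DATABASE_URI=${{Postgres.DATABASE_URL}}
REDIS_URI=${{Redis.REDIS_URL}}
```

References resolve at deploy time, so the URIs stay correct if Railway rotates the underlying credentials.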
Tracer