Everruns
Agent control plane

The durable runtime for long-running AI agents.

Everruns combines durable execution, a clear system shape, integration surfaces, and an operator console for teams running long-lived AI workflows.

Platform shape: Harnesses / Agents / Apps
Execution: Persisted in PostgreSQL
Interfaces: API / SDK / CLI / UI
Architecture

A system shape teams can reason about.

State, execution, and interfaces are separated cleanly so teams can understand how the platform behaves under real load.

[Diagram: Everruns platform overview]
Control plane

REST API, agent definitions, session lifecycle, secrets, and event fan-out.

Workers

Stateless executors that run the reason-act loop from persisted state.
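The worker model can be sketched as a loop that loads persisted state, advances one step, and writes progress back. Everything here is illustrative (a dict stands in for PostgreSQL, and `advance`, `run_worker`, and the state fields are hypothetical names, not the Everruns API) — the point is that because no progress lives in the worker, any worker can resume any session.

```python
# Illustrative sketch of a stateless worker: all progress lives in a
# store (a dict standing in for PostgreSQL), so a fresh worker can
# pick up a session exactly where the last one stopped.
def advance(state):
    """One reason-act step; here it just counts up toward a goal."""
    state["step"] += 1
    if state["step"] >= state["goal"]:
        state["done"] = True
    return state

def run_worker(store, session_id, max_steps=10):
    state = store[session_id]          # load persisted state
    for _ in range(max_steps):
        if state.get("done"):
            break
        state = advance(state)
        store[session_id] = state      # persist after every step
    return state

store = {"s1": {"step": 0, "goal": 5, "done": False}}
run_worker(store, "s1", max_steps=3)   # this worker "dies" after 3 steps
result = run_worker(store, "s1")       # a new worker resumes at step 3
print(result["step"], result["done"])  # → 5 True
```

Because each step is persisted before the next begins, losing a worker costs at most the in-flight step, never the session.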

Storage

PostgreSQL stores workflow state, events, configuration, and encrypted secrets.
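An event log of this kind is typically an append-only table ordered per session. The sketch below shows one plausible shape, using SQLite as a stand-in for PostgreSQL; the table and column names are assumptions, not the actual Everruns schema.

```python
import sqlite3

# Hypothetical append-only event log, with SQLite standing in for
# PostgreSQL. Column names are illustrative, not the Everruns schema.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE session_events (
        id         INTEGER PRIMARY KEY,
        session_id TEXT NOT NULL,
        seq        INTEGER NOT NULL,
        kind       TEXT NOT NULL,
        payload    TEXT NOT NULL,
        UNIQUE (session_id, seq)   -- one gap-free sequence per session
    )
""")
db.executemany(
    "INSERT INTO session_events (session_id, seq, kind, payload)"
    " VALUES (?, ?, ?, ?)",
    [("s1", 1, "message", '{"role":"user"}'),
     ("s1", 2, "tool_call", '{"tool":"search"}')],
)
rows = db.execute(
    "SELECT kind FROM session_events WHERE session_id = ? ORDER BY seq",
    ("s1",),
).fetchall()
print([k for (k,) in rows])  # → ['message', 'tool_call']
```

The per-session `UNIQUE (session_id, seq)` constraint is what makes replay and resume deterministic: readers can always reconstruct a session by scanning its events in order.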

Interfaces

Use the API, SDKs, CLI, or the management UI depending on the team and task.

Product surface

Built as a platform, not just a runtime.

Everruns spans harnesses, agents, skills, capabilities, MCP servers, apps, and the operator console around them.

Harnesses

Reusable durable execution units with their own dedicated surface in the product.

Agents

Configurable AI workers with optional model overrides, capabilities, and markdown prompts.

Skills

Instruction packages discovered from the workspace filesystem and activated per session.
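Filesystem discovery of this kind can be as simple as globbing a conventional directory. The sketch below assumes a `skills/*.md` layout, which is an illustration, not the actual Everruns convention.

```python
import pathlib
import tempfile

# Illustrative sketch: discover skills as markdown files under a
# workspace. The `skills/` layout and naming are assumptions.
def discover_skills(workspace: pathlib.Path) -> list[str]:
    return sorted(p.stem for p in workspace.glob("skills/*.md"))

with tempfile.TemporaryDirectory() as tmp:
    ws = pathlib.Path(tmp)
    (ws / "skills").mkdir()
    (ws / "skills" / "summarize.md").write_text("# Summarize documents")
    (ws / "skills" / "triage.md").write_text("# Triage incoming issues")
    print(discover_skills(ws))  # → ['summarize', 'triage']
```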

Capabilities

A registry of tools and behaviors spanning execution, browser, network, storage, UI, and session control.
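A registry like this is, at its core, a mapping from namespaced capability names to handlers that agents attach by name. The sketch below is a minimal illustration under that assumption; the group names, capability names, and decorator are hypothetical, not the Everruns API.

```python
# Minimal capability-registry sketch: handlers registered under
# "group.name" keys, attached to agents by name. All names here
# are illustrative.
REGISTRY = {}

def capability(group: str, name: str):
    """Decorator that registers a handler under 'group.name'."""
    def wrap(fn):
        REGISTRY[f"{group}.{name}"] = fn
        return fn
    return wrap

@capability("execution", "run_shell")
def run_shell(cmd: str) -> str:
    return f"ran: {cmd}"

@capability("storage", "put_object")
def put_object(key: str, data: bytes) -> str:
    return f"stored {len(data)} bytes at {key}"

# An agent attaches a subset of the registry by name:
attached = ["execution.run_shell"]
print(REGISTRY[attached[0]]("echo hi"))  # → ran: echo hi
print(sorted(REGISTRY))  # → ['execution.run_shell', 'storage.put_object']
```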

MCP Servers

A dedicated surface for external Model Context Protocol servers and tool bridges.

Apps

Deployment layer that connects agents to channels such as Slack.

Integrations

Integrations belong in the core story.

Execution sandboxes, evaluation tooling, model interfaces, and MCP bridges make Everruns useful inside real stacks. This matters more than local setup.

Execution

Daytona

Use isolated cloud sandboxes for code execution without changing the durable runtime model.

Evaluation

Braintrust

Connect observability and evaluation workflows to agent traces and real runs.

Model interface

Open Responses

Use a vendor-neutral model layer instead of rewriting integrations for every provider.

Tool bridge

MCP servers

Attach external tools and context providers through a platform-level integration surface.

Local setup

Try the stack locally in a few minutes.

Docker Compose is useful for evaluation and onboarding. Production deployments still depend on your own topology, providers, and operational model.

  1. Download the published Docker Compose example.
  2. Start the control plane, workers, UI, and database with local secrets configured.
  3. Create an agent, open a session, and stream events back to the client.

A good way to understand the stack quickly before wiring it into a larger environment.
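For orientation, a compose file following the steps above might look like the sketch below. The service names, image tags, and ports are placeholders (the API port mirrors the `localhost:9300` used in the API examples), not the published example.

```yaml
# Illustrative sketch only — not the published Docker Compose example.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: local-dev-only   # local secret, evaluation only
  control-plane:
    image: everruns/control-plane:latest  # placeholder image name
    depends_on: [postgres]
    ports: ["9300:9300"]
  worker:
    image: everruns/worker:latest         # placeholder image name
    depends_on: [control-plane]
  ui:
    image: everruns/ui:latest             # placeholder image name
    ports: ["3000:3000"]
```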

API shape

For most teams, the faster signal is the API: create an agent, start a session, and stream events.

curl -X POST http://localhost:9300/api/v1/agents \
  -H "Content-Type: application/json" \
  -d '{"name":"Assistant","system_prompt":"You are helpful."}'

curl -X POST http://localhost:9300/api/v1/sessions \
  -H "Content-Type: application/json" \
  -d '{"agent_id":"{agent_id}"}'

curl -N http://localhost:9300/api/v1/sessions/{session_id}/events
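The events endpoint streams continuously, which is why the last call uses `curl -N`. A client consumes it line by line; the sketch below parses server-sent-event style `data:` lines, which is an assumption about the wire format and runs against a canned stream rather than a live session.

```python
import json

# Illustrative parser for an SSE-style event stream. The `data:` line
# format and event type names are assumptions about the wire protocol.
def parse_events(lines):
    """Yield one decoded event per 'data: {...}' line."""
    for line in lines:
        if line.startswith("data: "):
            yield json.loads(line[len("data: "):])

stream = [
    'data: {"type": "message.delta", "text": "Hel"}',
    "",  # SSE events are separated by blank lines
    'data: {"type": "message.delta", "text": "lo"}',
    "",
    'data: {"type": "session.completed"}',
]
events = list(parse_events(stream))
print("".join(e.get("text", "") for e in events))  # → Hello
print(events[-1]["type"])  # → session.completed
```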
Reliability

Durability keeps long-running work resumable.

Long-running work stays observable and resumable even when infrastructure moves underneath it.

Service restart

Sessions resume from stored workflow state instead of replaying from scratch.

Worker loss

Execution continues because workers are stateless and progress already lives in PostgreSQL.

Long tool run

Event history and in-flight state remain observable over extended tasks.

[Screenshot: Everruns operator dashboard overview]
[Screenshot: Everruns agents screen with demo agents]
[Screenshot: Everruns capabilities registry screen]
Operator console

Operate the platform with the same clarity you build on it.

Create agents with prompts and capabilities, then manage providers, API keys, members, and connections from the same system.

Create agent
Name and description
Optional model override
Capability attachment
Markdown system prompt editor

LLM Providers

Configure providers first, then manage the models available to agents.

API Keys

Programmatic access is managed inside the operator console, not hidden in separate tooling.

Members

Membership management lives in settings alongside providers, API keys, and connections.

Connections

Personal auth and integration surfaces are part of the real operator workflow.

Open source

Open source core, built for real systems.

MIT licensed and built in Rust. Start with the repo, then follow the docs into architecture, API, and operations.

github.com/everruns/everruns