Open Source · BSL 1.1 · Built with Rust

Your organization's
living knowledge graph.

DocBrain captures knowledge from PRs, Slack threads, CI pipelines, and your IDE the moment it's created — connects it into a living graph, quality-scores it, and publishes it automatically.

25MB binary · <500ms cold start · 14 LLM providers · 13+ sources
25
MB binary size
<500ms
Cold start time
14
LLM providers
13+
Knowledge sources

Documentation is broken
at the source.

The problem isn't missing docs. It's that knowledge is created in one place and documented in another — if it's documented at all.

Knowledge dies in Slack threads

A senior engineer explains a critical architecture decision in a thread. Two months later, nobody can find it.

PRs are documentation goldmines

Every merged PR contains decisions, caveats, and procedures — but PR descriptions are write-once, read-never.

Quality is nobody's job

Code has linters, tests, and review workflows. Documentation has nothing. No quality scores, no style enforcement, no SLAs.

How It Works

The shift-left
documentation pipeline

Knowledge captured at the source, quality-scored automatically, published through configurable review workflows.

01

Capture

Knowledge fragments auto-extracted from merged PRs, Slack conversations, CI deployments, and IDE annotations.

Knowledge at the source
02

Score

Three-layer quality pipeline: structural analysis, team-defined style rules, and LLM-assessed semantic scoring.

Quality gates for docs
The Moat
03

Compose

Fragments cluster by semantic similarity. When enough accumulate, DocBrain auto-composes them into documentation.

Docs write themselves
04

Govern

Space ownership, SLA policies, multi-stage review workflows, and breach detection.

Accountability built in
What No One Else Does

The complete knowledge lifecycle

Other tools index existing docs. DocBrain captures the knowledge that was never written down and turns it into quality documentation.

Result: Documentation grounded in real engineering work, quality-scored, and governed — without asking anyone to write docs.

Access Everywhere

Meets you where
you work

One knowledge layer, five interfaces. Ask from Slack during an incident, from your IDE while coding, or from the terminal at 2am.

docbrain-cli
# engineering
JC
Jake Chen · 2:14 PM
Payments 502 after the deploy. Has anyone seen this before?
DB
DocBrain · 2:14 PM
Runbook: Payments Service Rollback Procedure
1. Check deploy diff: git log --oneline production..main
2. Rollback: kubectl rollout undo deployment/payment-processor
3. Verify: curl -sf https://api.example.com/health/payments
Sources: Payments On-Call Guide, Deploy Runbook v3 · Confidence: 91%
👍 4 · 🏆 2
feat: add idempotency keys to refund endpoint Merged
DB
DocBrain Bot commented 2 minutes ago
3 knowledge fragments extracted
1. Decision: Idempotency keys prevent duplicate refund processing
2. Caveat: Keys expire after 24h; retry must generate new key
3. Procedure: Manual refund override via admin API requires manager approval
Confidence: 92% · Auto-indexed to "Payments" space
payment_service.rs
refund_handler.rs
pub async fn process_refund(req: RefundRequest) -> Result<Refund, RefundError> {
  // Idempotency check: a repeated key returns the cached result
  if let Some(existing) = cache.get(&req.idempotency_key) {
    return Ok(existing.clone());
  }
  // Process refund...
}

4-Layer Memory System

A brain that
remembers everything

Four memory layers work together so every question makes the system sharper. The 100th query about Kubernetes is faster and better-answered than the first.

~40%

Cache hit rate

<3s

Avg response

25MB

Binary (Rust)

Working Memory Redis

Multi-turn conversation context. Resolves "it", "that service", "the same thing."

Episodic Memory Postgres + OpenSearch

Every Q&A ever asked. Semantic caching, feedback learning. Gets cheaper as it learns.

Semantic Memory Knowledge Graph

Entity relationships via graph traversal. "Who owns the payment service?" — resolved, not searched.

Procedural Memory Learned Rules

Adapts retrieval from feedback. Discovers: "deployment questions → search DevOps first."
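The read path implied by these four layers can be sketched as a cheapest-first dispatch: check conversational context, then the semantic cache, then the graph, and only fall back to full retrieval last. A minimal sketch, assuming this ordering — the dispatch logic and function names here are illustrative, not DocBrain's internals:

```rust
// Illustrative layer dispatch: cheaper memory layers are consulted first.
#[derive(Debug, PartialEq)]
enum Layer {
    Working,   // Redis: multi-turn conversation context
    Episodic,  // semantic cache of past Q&A (~40% hit rate)
    Semantic,  // knowledge-graph traversal for entity lookups
    Retrieval, // full hybrid search + LLM answer
}

fn answer_via(in_conversation: bool, cache_hit: bool, entity_lookup: bool) -> Layer {
    if in_conversation {
        Layer::Working
    } else if cache_hit {
        Layer::Episodic
    } else if entity_lookup {
        Layer::Semantic
    } else {
        Layer::Retrieval
    }
}

fn main() {
    // A repeat question hits the episodic cache and skips retrieval entirely.
    println!("{:?}", answer_via(false, true, false));
}
```

This ordering is why the 100th Kubernetes query is cheaper than the first: it resolves at an earlier, cheaper layer.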

Features

Everything you need

Knowledge Fragments

First-class knowledge units with provenance. Auto-captured from PRs, conversations, and CI pipelines.

CI/CD Capture

Merged PRs and deployments auto-analyzed by LLM. Decisions, caveats, procedures extracted as fragments.

Quality Pipeline

Three-layer scoring: structural, style rules, and LLM-assessed semantic quality. Composite score 0-100.

Confidence-Scored Q&A

Hybrid search, intent classification, cited answers with freshness. Low confidence asks questions instead of hallucinating.

Knowledge Graph

Entity relationships with BFS/DFS traversal. Blast radius analysis, expertise routing.
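"Blast radius" here means everything reachable from a changed entity. A minimal sketch of that BFS traversal, assuming a simple adjacency-list graph — the service names and edge data are made up for illustration:

```rust
use std::collections::{HashMap, HashSet, VecDeque};

// BFS from a changed service: collect every downstream entity it can reach.
fn blast_radius(graph: &HashMap<&str, Vec<&str>>, start: &str) -> Vec<String> {
    let mut seen: HashSet<&str> = HashSet::from([start]);
    let mut queue = VecDeque::from([start]);
    let mut affected = Vec::new();
    while let Some(node) = queue.pop_front() {
        for &dep in graph.get(node).into_iter().flatten() {
            if seen.insert(dep) {
                affected.push(dep.to_string());
                queue.push_back(dep);
            }
        }
    }
    affected
}

fn main() {
    let graph = HashMap::from([
        ("payment-service", vec!["billing-api", "refund-worker"]),
        ("billing-api", vec!["invoice-ui"]),
    ]);
    // Everything impacted if payment-service changes, nearest first.
    println!("{:?}", blast_radius(&graph, "payment-service"));
}
```

The same traversal, run over "who touched this" edges instead of dependency edges, is what makes expertise routing possible.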

Documentation Autopilot

Clusters unanswered questions, detects gaps, auto-drafts missing documentation.

14 LLM Providers

Anthropic, OpenAI, Bedrock, Ollama (local), Gemini, Azure, and 8 more. Swap via config.

SSO & RBAC

GitHub/GitLab/OIDC SSO. Four roles. Space-level isolation. API keys Argon2-hashed.

Source-System ACL

Mirrors real Confluence / Slack / GitHub / Jira permissions at query time. Restricted page in source = filtered out for users who can't read it. Three modes, side-channel-safe, audit-logged.
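Mirroring source permissions at query time amounts to filtering search hits against an ACL copied from the source system. A deny-by-default sketch of that filter — the struct fields and rule here are assumptions for illustration, not DocBrain's schema:

```rust
// Query-time ACL filter: a hit survives only if the querying user
// appears in the permissions mirrored from the source system.
struct SearchHit {
    doc_id: &'static str,
    allowed_users: Vec<&'static str>, // mirrored from Confluence/Slack/etc.
}

fn filter_hits(hits: Vec<SearchHit>, user: &str) -> Vec<&'static str> {
    hits.into_iter()
        .filter(|h| h.allowed_users.iter().any(|u| *u == user))
        .map(|h| h.doc_id)
        .collect()
}

fn main() {
    let hits = vec![
        SearchHit { doc_id: "payments-runbook", allowed_users: vec!["alice", "bob"] },
        SearchHit { doc_id: "restricted-page", allowed_users: vec!["hr-lead"] },
    ];
    // alice never sees the restricted page — it is dropped, not redacted.
    println!("{:?}", filter_hits(hits, "alice"));
}
```

Dropping the hit entirely (rather than showing a redacted stub) is what "side-channel-safe" implies: the user cannot infer that a restricted document matched.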

Built with Rust

25MB binary. <500ms cold start. Under 100MB memory. Self-hosted, your data stays yours.

Connector SDK

Plug in any knowledge source in any language. 3 endpoints, DocBrain handles the rest.
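A three-endpoint connector maps naturally onto a three-method interface: describe the source, list what changed, fetch full content. A hedged sketch of that shape — the method names (`manifest`, `list_changed`, `fetch`) and types are assumptions, not the SDK's actual API:

```rust
// Illustrative connector interface; a real connector exposes the
// equivalent over HTTP and DocBrain drives the sync loop.
struct Manifest { name: String }
struct DocRef { id: String, updated_at: u64 }
struct Document { id: String, body: String }

trait Connector {
    fn manifest(&self) -> Manifest;                    // describe the source
    fn list_changed(&self, since: u64) -> Vec<DocRef>; // incremental sync
    fn fetch(&self, id: &str) -> Option<Document>;     // full content
}

struct WikiConnector;

impl Connector for WikiConnector {
    fn manifest(&self) -> Manifest {
        Manifest { name: "internal-wiki".into() }
    }
    fn list_changed(&self, since: u64) -> Vec<DocRef> {
        // A real connector would query the source API here.
        vec![DocRef { id: "wiki-1".into(), updated_at: since + 1 }]
    }
    fn fetch(&self, id: &str) -> Option<Document> {
        Some(Document { id: id.into(), body: "page content".into() })
    }
}

fn main() {
    let c = WikiConnector;
    for doc in c.list_changed(0) {
        println!("{} changed at {}", doc.id, doc.updated_at);
    }
}
```

Because the contract is this small, a connector can live in any language that can serve three HTTP routes.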

MCP IDE Capture

10 tools for Claude Code, Cursor, and any MCP editor. Capture decisions at commit time.

13+ Knowledge Sources

Confluence, Slack, Teams, GitHub, GitLab, Jira, PagerDuty, OpsGenie, Zendesk, and more.

Shift-Left In Practice

A day with DocBrain

Knowledge captured where the work happens. Not after. Not in a doc sprint. At the source, in real-time.

1

9:15 AM — PR Merged

Engineer merges a payment refund PR

DocBrain's CI capture extracts 3 knowledge fragments: a decision about retry logic, a caveat about idempotency keys, and a procedure for manual refund overrides. All auto-indexed with 92% confidence.

2

11:30 AM — Slack Thread

Support asks "how do refunds work now?"

DocBrain answers with fragments captured 2 hours ago — plus links to the original PR. The support engineer types /docbrain capture to save additional context.

3

2:00 PM — Auto-Composition

5 fragments cluster into a draft doc

DocBrain detects semantic similarity across PR fragments, Slack capture, and older fragments. Composes "Payment Refund Procedures," scores at 78/100, routes for SME review.

4

3:30 PM — Published

Reviewed, style-checked, published

Quality rules catch 2 violations, auto-fixed. Final score: 91/100. Published to "Payments" space. Total engineer effort: zero.

Traditional: schedule a doc sprint, assign writers, review in 2 weeks, stale in 2 months.

Shift-left: captured at 9:15 AM. Published by 3:30 PM. Stays accurate forever.

FAQ

Common questions

Is DocBrain really free?
Open source under BSL 1.1. Free for production use. Only restriction: you cannot offer DocBrain as a hosted service. Converts to Apache 2.0 at 5k stars or Jan 1, 2028.
Does my data leave my infrastructure?
When self-hosted, docs and queries never leave your infra. Use Ollama for 100% local, air-gapped deployment with zero data egress.
Which LLMs does DocBrain support?
14 providers: Anthropic (Claude), OpenAI, AWS Bedrock, Vertex AI, Azure OpenAI, Groq, OpenRouter, Together, DeepSeek, Mistral, xAI, Gemini, Cohere, and Ollama. Swap via config.
Does DocBrain learn from feedback?
Yes. Thumbs up/down updates procedural retrieval rules and feeds Autopilot gap detection. Optional learning pipeline fine-tunes embeddings from feedback.
How long does setup take?
Docker Compose: about 5 minutes. Run bash scripts/setup.sh — PostgreSQL, OpenSearch, Redis, migrations, sample docs. Add your LLM API key and go.
What is "shift-left documentation"?
Same idea as shift-left testing: capture knowledge at the source instead of documenting after the fact. Fragments accumulate, cluster, and auto-compose into quality-scored documentation.
How are docs quality-scored?
Three layers: Structural (deterministic), Style (configurable rules per team), and Semantic (LLM-assessed). Composite score 0-100.
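Combining the three layers into one 0-100 number is a weighted average. A minimal sketch, assuming illustrative weights (structure 30%, style 30%, semantic 40% — not DocBrain's actual values):

```rust
// Hypothetical composite: each layer scores 0-100; integer math keeps
// the result deterministic, and the +50 rounds to the nearest point.
fn composite_score(structural: u32, style: u32, semantic: u32) -> u32 {
    (30 * structural + 30 * style + 40 * semantic + 50) / 100
}

fn main() {
    // A structurally clean doc with a weaker semantic assessment.
    println!("{}", composite_score(90, 80, 70));
}
```

Weighting semantic quality highest reflects that the LLM-assessed layer is the only one that judges whether the doc actually answers its question.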
Can I integrate with PagerDuty / incident management?
Yes. Native PagerDuty and OpsGenie ingestion. Knowledge Stream fires early warnings during incidents. Webhook subscriptions push events to any tool.

Self-host in 5 minutes

Docker, an API key, and three commands. That's it.

quickstart
$ git clone https://github.com/docbrain-ai/docbrain
$ cd docbrain
$ bash scripts/setup.sh
✓ Created .env with defaults
✓ Started PostgreSQL, OpenSearch, Redis
✓ Ran 43 migrations
✓ Ingested sample docs
✓ Server running at http://localhost:3000
✓ Web UI at http://localhost:3001
Bootstrap admin key: db_sk_boot_...