This is what Attest does

Text goes in. Structured, provenanced claims come out. Every fact carries who said it, how confident we are, and what else corroborates it.

[Architecture] Sources (Slack, Teams, LLM chat, documents, email, databases, external) feed into Attest, which extracts, stores, queries, corrects, and discovers; results are exposed via the MCP Server, REST API, Dashboard, Python SDK, and NDJSON.
What goes in
#eng-decisions · 2024-02-15 · Sarah Chen

“Talked to Platform team — they approved migrating auth from JWT to sessions. I'll own the migration. Timeline is Q2. Redis will handle session storage, API gateway needs to be updated.”
One Slack message. Buried in a channel nobody will search again.
What comes out
sarah chen proposed auth migration
conf 0.85 · source: slack/#eng-decisions

platform team approved auth migration
conf 0.90 · source: slack/#eng-decisions

auth migration depends_on redis
conf 0.80 · source: slack/#eng-decisions

api-gateway affected_by auth migration
conf 0.75 · source: slack/#eng-decisions
4 claims. Each with provenance, confidence, entity types. Queryable. Retractable. Corroborable.
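Conceptually, each claim is a small, self-describing record. A minimal sketch of that shape, where the field names are illustrative rather than the SDK's exact schema:

```python
from dataclasses import dataclass

@dataclass
class Claim:
    subject: str        # e.g. "sarah chen"
    predicate: str      # e.g. "proposed"
    object: str         # e.g. "auth migration"
    confidence: float   # 0.0 to 1.0
    source: str         # provenance, e.g. "slack/#eng-decisions"

claim = Claim("platform team", "approved", "auth migration",
              0.90, "slack/#eng-decisions")
assert 0.0 <= claim.confidence <= 1.0
```

Because every record carries its own provenance and confidence, claims can be filtered, corroborated, or retracted individually rather than as whole documents.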

Full Walkthrough

Two ways to get started: the console or the Python SDK. Same database underneath.

Console

1. Install

pip install attestdb attest-console
2. Launch the console

attest-console my_company.db

Your browser opens to localhost:8877. You'll see an empty dashboard — that's expected.

3. Connect a data source

Click Sources in the sidebar. Click Connect Slack or Connect Google. Authorize with your workspace. OAuth is handled through Attest's hosted proxy — no API keys, no app registration. Your data flows directly between your machine and Slack/Google.

Don't have Slack or Google? Skip to step 4 — you can paste text directly.

4. Ingest

Click Ingest in the sidebar.

  • Slack: Select “Slack”, optionally filter to specific channels, hit Start Ingestion.
  • Gmail: Select “Gmail”; it pulls the 50 most recent threads.
  • Google Docs: Select “Google Docs”; it fetches up to 50 documents.
  • Paste Text: Select “Text”, paste any content into the text box, hit Start.

Try pasting this into the text box to see extraction in action:

Talked to Platform team — they approved migrating auth from JWT to sessions.
I'll own the migration. Timeline is Q2. Redis will handle session storage,
API gateway needs to be updated.

Watch the progress bar. Claims appear as they're extracted.

Extraction Mode   API Key?   Best For
Heuristic         No         Explicit statements (“X depends on Y”). Free, runs offline.
Smart             Yes        Heuristic first, LLM only for novel content. Best value.
LLM               Yes        Full LLM extraction. Best for nuanced, implicit relationships.
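The heuristic mode's pattern matching can be pictured with a toy example. This regex-based sketch is purely illustrative; it is not the extractor Attest ships:

```python
import re

# Toy heuristic: match explicit "X depends on Y" statements.
PATTERN = re.compile(r"(?P<subject>[\w-]+) depends on (?P<object>[\w-]+)")

def extract_heuristic(text):
    """Return (subject, 'depends_on', object) triples from explicit phrasing."""
    return [
        (m["subject"], "depends_on", m["object"])
        for m in PATTERN.finditer(text)
    ]

claims = extract_heuristic("api-gateway depends on redis")
# claims -> [("api-gateway", "depends_on", "redis")]
```

Smart mode runs patterns like this first and escalates to the LLM only when the text doesn't match anything explicit, which is why it tends to be the best cost/quality trade-off.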
5. Ask questions

Click Ask in the sidebar. Type a question. Every answer traces back to a specific source — not generated, grounded.

Try these:

Who owns the auth migration?
What depends on Redis?
What are the main projects?
6. Explore

Entities — browse and search all entities in the knowledge graph.
Graph — interactive D3 visualization of entity relationships.
Quality — see knowledge gaps, single-source alerts, health metrics.

Python SDK

1. Install

pip install attestdb
2. Open a database

python3

Then in the Python REPL:

import attestdb

db = attestdb.open("my_company.db")

Creates a new database file if it doesn't exist. Zero config.

3. Ingest text

db.ingest_text(
    "Talked to Platform team — they approved migrating auth from JWT to sessions. "
    "I'll own the migration. Timeline is Q2. Redis will handle session storage, "
    "API gateway needs to be updated.",
    source_id="slack/eng-decisions/2024-02-15",
)

Returns an ExtractionResult with the number of claims extracted.

4. Query

frame = db.query("auth migration", depth=2)
print(frame.narrative)
for rel in frame.direct_relationships:
    print(f"  {rel.target.name} --[{rel.predicate}]--> conf={rel.confidence:.2f}")
5. Ingest from a Slack export

Go to your Slack workspace: Settings → Workspace settings → Import/Export → Export. Download the ZIP, then:

db.ingest_slack("slack_export.zip", extraction="heuristic")

Heuristic extraction is free and runs offline. For deeper extraction:

db.configure_curator("groq")   # free tier available — set GROQ_API_KEY first
db.ingest_slack("slack_export.zip", extraction="smart")
6. Ingest from a ChatGPT export

Go to ChatGPT: Settings → Data controls → Export data. You'll get an email with a ZIP.

results = db.ingest_chat_file("conversations.zip")
print(f"Ingested {sum(r.claims_ingested for r in results)} claims from {len(results)} conversations")
7. Ingest structured claims directly

db.ingest(
    subject=("api-gateway", "service"),
    predicate=("depends_on", "depends_on"),
    object=("redis", "service"),
    provenance={"source_type": "config_management", "source_id": "k8s-manifest-v2.3"},
    confidence=1.0,
)
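The structured-ingest call lends itself to bulk loading. A small sketch that flattens a dependency map into (subject, predicate, object) triples; the map and helper here are made up for illustration, and each triple would feed one db.ingest() call:

```python
# Hypothetical dependency map, e.g. parsed from config management.
deps = {
    "api-gateway": ["redis", "auth-service"],
    "session-service": ["redis"],
}

def claims_from_deps(deps):
    """Yield (subject, predicate, object) tuples in the (name, type) form
    used by db.ingest() above."""
    for service, targets in deps.items():
        for target in targets:
            yield (
                (service, "service"),
                ("depends_on", "depends_on"),
                (target, "service"),
            )

triples = list(claims_from_deps(deps))
# triples[0] -> (("api-gateway", "service"), ("depends_on", "depends_on"), ("redis", "service"))
```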
8. Explore the knowledge base

# List all entities
for e in db.list_entities()[:10]:
    print(f"  {e.name} ({e.entity_type}) — {e.claim_count} claims")

# Quality report
report = db.quality_report()
print(f"Single-source entities: {report.single_source_entity_count}")
print(f"Knowledge gaps: {report.gap_count}")

# Find what's missing
gaps = db.find_gaps()
for gap in gaps[:5]:
    print(f"  {gap}")
9. Retract a bad source

cascade = db.retract_cascade("old-runbook-v1", reason="Outdated procedure")
print(f"Retracted: {cascade.source_retract.retracted_count}")
print(f"Downstream degraded: {cascade.degraded_count}")

Corroborated facts survive. Single-source facts are degraded. Nothing is deleted — full audit trail.
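The survival rule can be sketched as a filter over each claim's supporting sources. This is an illustration of the policy, not the engine's implementation:

```python
def degrade_after_retraction(claims, retracted_source):
    """claims: dict mapping claim text -> list of supporting sources.
    Claims with other sources survive; claims backed only by the
    retracted source are degraded (not deleted)."""
    surviving, degraded = {}, []
    for claim, sources in claims.items():
        remaining = [s for s in sources if s != retracted_source]
        if remaining:
            surviving[claim] = remaining   # corroboration survives
        else:
            degraded.append(claim)         # lost its only source
    return surviving, degraded

claims = {
    "api-gateway depends_on redis": ["k8s-manifest-v2.3", "old-runbook-v1"],
    "restart via cron": ["old-runbook-v1"],
}
surviving, degraded = degrade_after_retraction(claims, "old-runbook-v1")
# The corroborated dependency survives; "restart via cron" is degraded.
```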

10. Time-travel

import time
yesterday = time.time_ns() - 86_400 * 10**9
snapshot = db.at(yesterday)
frame = snapshot.query("api-gateway", depth=1)
print(frame.narrative)

What did we know yesterday? Last week? Before the bad data came in? Every query is a time-travel query.
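Conceptually, a snapshot query filters claims by when they were recorded and retracted. A toy as-of filter, assuming nanosecond timestamps as in db.at() above (not the actual storage engine):

```python
def as_of(claims, timestamp_ns):
    """Return claims visible at timestamp_ns: recorded at or before it,
    and not yet retracted at that moment."""
    return [
        c for c in claims
        if c["recorded_at"] <= timestamp_ns
        and (c["retracted_at"] is None or c["retracted_at"] > timestamp_ns)
    ]

DAY = 86_400 * 10**9  # nanoseconds, matching time.time_ns()
claims = [
    {"text": "api-gateway depends_on redis", "recorded_at": 0, "retracted_at": None},
    {"text": "auth uses JWT", "recorded_at": 0, "retracted_at": 5 * DAY},
]
# At day 3 both claims are visible; by day 7 the retracted one is gone.
assert len(as_of(claims, 3 * DAY)) == 2
assert len(as_of(claims, 7 * DAY)) == 1
```

Because retraction records a timestamp instead of deleting rows, any past state remains reconstructible.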

What Queries Look Like

Who owns the auth migration?
sarah chen proposed auth migration — conf 0.85
platform team approved auth migration — conf 0.90
sources: slack/#eng-decisions (2024-02-15), eng-all-hands-2024-02
What breaks if Redis goes down?
api-gateway depends_on redis — conf 1.0 (3 sources)
session-service depends_on redis — conf 0.90 (2 sources)
rate-limiter depends_on redis — conf 0.85 (1 source)
sources: k8s-manifest-v2.3, incident-42, postmortem-redis-2024
Any gaps in what we know about payment-service?
payment-service has 3 claims, all from one source
Missing predicates: depends_on, owned_by
alert: single-source knowledge — no corroboration, at risk if source retracted

Every answer traces back to a specific Slack message, email, or config file. Not generated — grounded in claims with provenance.
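Notice how the corroborated facts above carry higher confidence. One common way to model that effect is a noisy-OR combination, where each independent source shrinks the residual doubt; this is an illustrative model, not Attest's documented aggregation rule:

```python
def corroborated_confidence(confidences):
    """Noisy-OR: combined confidence from independent sources.
    Illustrative only -- Attest's actual aggregation is not specified here."""
    doubt = 1.0
    for c in confidences:
        doubt *= 1.0 - c
    return 1.0 - doubt

# One 0.8 source stays at 0.8; two independent 0.8 sources reach 0.96.
assert abs(corroborated_confidence([0.8]) - 0.8) < 1e-9
assert abs(corroborated_confidence([0.8, 0.8]) - 0.96) < 1e-9
```

Whatever the exact formula, the design point is the same: single-source claims stay visibly weaker, which is what the Quality page's single-source alerts surface.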

Backend Options

db = attestdb.open("my.db")                       # Kuzu (default)
db = attestdb.open("my.db", backend="rust")      # Rust (1.3M claims/sec)
db = attestdb.open("my.db", backend="auto")      # Try Rust, fall back to Kuzu

Both backends have the same API and produce identical results. The Rust backend is faster for bulk ingestion and large datasets.

LLM Providers

Heuristic extraction needs no API key. For LLM-powered extraction, set one environment variable:

export GROQ_API_KEY="your-key-here"

Then configure:

db.configure_curator("groq")

Provider     Env Variable         Free Tier?
Groq         GROQ_API_KEY         Yes
DeepSeek     DEEPSEEK_API_KEY     Yes (limited)
Grok         GROK_API_KEY         Yes (limited)
OpenAI       OPENAI_API_KEY       No
Anthropic    ANTHROPIC_API_KEY    No
OpenRouter   OPENROUTER_API_KEY   Some models
GLM          GLM_API_KEY          Yes
Advanced: Self-host OAuth credentials

By default, the console routes OAuth through Attest's hosted proxy (auth.attestdb.com). The proxy holds the OAuth app credentials; your data flows directly between your machine and Slack/Google — the proxy never sees your messages or documents, only the OAuth token exchange.

If you prefer to use your own OAuth apps, set environment variables before launching:

export SLACK_CLIENT_ID="your-slack-client-id"
export SLACK_CLIENT_SECRET="your-slack-client-secret"
export GOOGLE_CLIENT_ID="your-google-client-id"
export GOOGLE_CLIENT_SECRET="your-google-client-secret"
attest-console my_company.db

Create a Slack app at api.slack.com/apps with scopes: channels:history, channels:read, users:read. Create Google OAuth credentials at Google Cloud Console with Gmail, Drive, and Docs APIs enabled. Set redirect URIs to http://localhost:8877/auth/slack/callback and http://localhost:8877/auth/google/callback.

Next Steps

Cookbooks

Full working examples: org knowledge, biomedical research, DevOps, and ML.

API Reference

Every method, parameter, and return type.

Core Concepts

What claim-native means and why provenance changes everything.

Agent Integration

Connect AI agents via MCP server or REST API. Event hooks for reactive pipelines.