Text goes in. Structured, provenanced claims come out. Every fact carries who said it, how confident we are, and what else corroborates it.
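Conceptually, each extracted fact is a triple plus provenance and confidence. A minimal sketch in plain Python (the field names here are illustrative, not attestdb's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    # What is asserted: a (subject, predicate, object) triple.
    subject: str
    predicate: str
    object: str
    # Who said it, and where: e.g. a Slack channel and message timestamp.
    source_id: str
    # How confident we are, 0.0 to 1.0.
    confidence: float
    # Independent sources that corroborate the same triple.
    corroborated_by: list[str] = field(default_factory=list)

c = Claim("api-gateway", "depends_on", "redis",
          source_id="slack/eng-decisions/2024-02-15", confidence=0.9)
print(c.subject, c.predicate, c.object, c.confidence)
```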
Two ways to get started. Same database underneath.
```shell
pip install attestdb attest-console
attest-console my_company.db
```
Your browser opens to localhost:8877.
You'll see an empty dashboard — that's expected.
Click Sources in the sidebar. Click Connect Slack or Connect Google. Authorize with your workspace. OAuth is handled through Attest's hosted proxy — no API keys, no app registration. Your data flows directly between your machine and Slack/Google.
Don't have Slack or Google? Skip to step 4 — you can paste text directly.
Click Ingest in the sidebar.
Try pasting this into the text box to see extraction in action:
```
Talked to Platform team — they approved migrating auth from JWT to sessions. I'll own the migration. Timeline is Q2. Redis will handle session storage, API gateway needs to be updated.
```
Watch the progress bar. Claims appear as they're extracted.
| Extraction Mode | API Key? | Best For |
|---|---|---|
| Heuristic | No | Explicit statements ("X depends on Y"). Free, runs offline. |
| Smart | Yes | Heuristic first, LLM only for novel content. Best value. |
| LLM | Yes | Full LLM extraction. Best for nuanced, implicit relationships. |
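Heuristic mode can be pictured as pattern matching over explicit statements. A toy sketch (a single rule, not attestdb's actual extraction logic), assuming simple "X depends on Y" phrasing:

```python
import re

# One illustrative pattern; a real heuristic extractor would have many.
DEPENDS_ON = re.compile(r"(\w[\w-]*) depends on (\w[\w-]*)", re.IGNORECASE)

def extract_heuristic(text: str) -> list[tuple[str, str, str]]:
    """Return (subject, 'depends_on', object) triples for explicit statements."""
    return [(m.group(1), "depends_on", m.group(2))
            for m in DEPENDS_ON.finditer(text)]

triples = extract_heuristic("The api-gateway depends on redis.")
print(triples)  # [('api-gateway', 'depends_on', 'redis')]
```

Implicit phrasing ("we should probably keep the gateway pointed at the cache") slips past rules like this, which is what the LLM modes are for.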
Click Ask in the sidebar. Type a question. Every answer traces back to a specific source — not generated, grounded.
Try these:
```
Who owns the auth migration?
What depends on Redis?
What are the main projects?
```
- Entities — browse and search all entities in the knowledge graph.
- Graph — interactive D3 visualization of entity relationships.
- Quality — see knowledge gaps, single-source alerts, health metrics.
```shell
pip install attestdb
python3
```
Then in the Python REPL:
```python
import attestdb

db = attestdb.open("my_company.db")
```
Creates a new database file if it doesn't exist. Zero config.
```python
db.ingest_text(
    "Talked to Platform team — they approved migrating auth from JWT to sessions. "
    "I'll own the migration. Timeline is Q2. Redis will handle session storage, "
    "API gateway needs to be updated.",
    source_id="slack/eng-decisions/2024-02-15",
)
```
Returns an ExtractionResult with the number of claims extracted.
```python
frame = db.query("auth migration", depth=2)
print(frame.narrative)

for rel in frame.direct_relationships:
    print(f"  {rel.target.name} --[{rel.predicate}]--> conf={rel.confidence:.2f}")
```
Go to your Slack workspace: Settings → Workspace settings → Import/Export → Export. Download the ZIP, then:
```python
db.ingest_slack("slack_export.zip", extraction="heuristic")
```
Heuristic extraction is free and runs offline. For deeper extraction:
```python
db.configure_curator("groq")  # free tier available — set GROQ_API_KEY first
db.ingest_slack("slack_export.zip", extraction="smart")
```
Go to ChatGPT: Settings → Data controls → Export data. You'll get an email with a ZIP.
```python
results = db.ingest_chat_file("conversations.zip")
print(f"Ingested {sum(r.claims_ingested for r in results)} claims "
      f"from {len(results)} conversations")
```
```python
db.ingest(
    subject=("api-gateway", "service"),
    predicate=("depends_on", "depends_on"),
    object=("redis", "service"),
    provenance={"source_type": "config_management", "source_id": "k8s-manifest-v2.3"},
    confidence=1.0,
)
```
```python
# List all entities
for e in db.list_entities()[:10]:
    print(f"  {e.name} ({e.entity_type}) — {e.claim_count} claims")

# Quality report
report = db.quality_report()
print(f"Single-source entities: {report.single_source_entity_count}")
print(f"Knowledge gaps: {report.gap_count}")

# Find what's missing
gaps = db.find_gaps()
for gap in gaps[:5]:
    print(f"  {gap}")
```
```python
cascade = db.retract_cascade("old-runbook-v1", reason="Outdated procedure")
print(f"Retracted: {cascade.source_retract.retracted_count}")
print(f"Downstream degraded: {cascade.degraded_count}")
```
Corroborated facts survive. Single-source facts are degraded. Nothing is deleted — full audit trail.
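The cascade logic can be sketched as: lower the confidence of every claim whose only support was the retracted source, and leave corroborated claims intact. A standalone sketch with hypothetical field names (not attestdb's internals):

```python
def degrade_on_retraction(claims, retracted_source, penalty=0.5):
    """Degrade single-source claims; corroborated claims survive untouched.

    Nothing is deleted: retraction is recorded and confidence lowered.
    """
    degraded = []
    for claim in claims:
        sources = claim["sources"]
        if retracted_source in sources and len(sources) == 1:
            claim["confidence"] *= penalty      # degrade, don't delete
            claim["degraded_by"] = retracted_source
            degraded.append(claim)
    return degraded

claims = [
    # Corroborated by a second source: survives retraction at full confidence.
    {"fact": "gateway->redis", "sources": ["old-runbook-v1", "k8s-manifest"],
     "confidence": 1.0},
    # Single-source: degraded when its only source is retracted.
    {"fact": "gateway->memcached", "sources": ["old-runbook-v1"],
     "confidence": 0.8},
]
degraded = degrade_on_retraction(claims, "old-runbook-v1")
print(f"degraded: {len(degraded)}")  # degraded: 1
```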
```python
import time

yesterday = time.time_ns() - 86_400 * 10**9
snapshot = db.at(yesterday)
frame = snapshot.query("api-gateway", depth=1)
print(frame.narrative)
```
What did we know yesterday? Last week? Before the bad data came in? Every query is a time-travel query.
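Time travel can be pictured as filtering by ingestion timestamp: a snapshot is just every claim recorded at or before T. An illustrative sketch, not the actual storage model:

```python
import time

def snapshot_at(claims, t_ns):
    """Return only the claims that were recorded at or before t_ns."""
    return [c for c in claims if c["recorded_at_ns"] <= t_ns]

now = time.time_ns()
day_ns = 86_400 * 10**9
claims = [
    {"fact": "gateway->redis", "recorded_at_ns": now - 2 * day_ns},
    {"fact": "gateway->sessions", "recorded_at_ns": now},  # ingested just now
]
yesterday = now - day_ns
print(len(snapshot_at(claims, yesterday)))  # 1: only the older claim existed
```

Because nothing is deleted on retraction, the same filter works backwards past bad data too.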
Every answer traces back to a specific Slack message, email, or config file. Not generated — grounded in claims with provenance.
```python
db = attestdb.open("my.db")                    # Kuzu (default)
db = attestdb.open("my.db", backend="rust")    # Rust (1.3M claims/sec)
db = attestdb.open("my.db", backend="auto")    # Try Rust, fall back to Kuzu
```
Both backends have the same API and produce identical results. The Rust backend is faster for bulk ingestion and large datasets.
Heuristic extraction needs no API key. For LLM-powered extraction, set one environment variable:
```shell
export GROQ_API_KEY="your-key-here"
```
Then configure:
```python
db.configure_curator("groq")
```
| Provider | Env Variable | Free Tier? |
|---|---|---|
| Groq | GROQ_API_KEY | Yes |
| DeepSeek | DEEPSEEK_API_KEY | Yes (limited) |
| Grok | GROK_API_KEY | Yes (limited) |
| OpenAI | OPENAI_API_KEY | No |
| Anthropic | ANTHROPIC_API_KEY | No |
| OpenRouter | OPENROUTER_API_KEY | Some models |
| GLM | GLM_API_KEY | Yes |
By default, the console routes OAuth through Attest's hosted proxy (auth.attestdb.com). The proxy holds the OAuth app credentials; your data flows directly between your machine and Slack/Google — the proxy never sees your messages or documents, only the OAuth token exchange.
If you prefer to use your own OAuth apps, set environment variables before launching:
```shell
export SLACK_CLIENT_ID="your-slack-client-id"
export SLACK_CLIENT_SECRET="your-slack-client-secret"
export GOOGLE_CLIENT_ID="your-google-client-id"
export GOOGLE_CLIENT_SECRET="your-google-client-secret"

attest-console my_company.db
```
- Create a Slack app at api.slack.com/apps with scopes: `channels:history`, `channels:read`, `users:read`.
- Create Google OAuth credentials in the Google Cloud Console with the Gmail, Drive, and Docs APIs enabled.
- Set redirect URIs to `http://localhost:8877/auth/slack/callback` and `http://localhost:8877/auth/google/callback`.
Full working examples: org knowledge, biomedical research, DevOps, and ML.
Every method, parameter, and return type.
What claim-native means and why provenance changes everything.
Connect AI agents via MCP server or REST API. Event hooks for reactive pipelines.