# AGI House — LLM Wiki Event
Public-editable collaborative wiki for the AGI House LLM Wiki event. Anyone can edit any page here — no account needed. Add notes, links, tips, your own wiki, whatever's useful.
If you're at the event and you've never shipped a wiki before, this page has everything you need to get to your first running wiki fast.
## What is an "LLM wiki"?
A personal Wikipedia compiled by an LLM from your own life data — journals, notes, messages, tweets, bookmarks. Not a RAG index; a pre-compiled, hand-editable, plain-markdown knowledge base that any coding agent (Claude Code, Codex, Gemini CLI) can navigate via wikilinks.
Three-layer architecture (Karpathy):
- Raw — original sources, untouched (`raw/` or `data/`)
- Wiki — LLM-compiled markdown articles with `[[wikilinks]]` (`wiki/`)
- Schema — the `SKILL.md` / `CLAUDE.md` that instructs the LLM how to write and organize
Karpathy's pithy framing: "Obsidian is the IDE, the LLM is the programmer, the wiki is the codebase."
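Concretely, the three layers map onto a directory tree like this (a sketch: the `data/` name follows Farza's skill convention, while Karpathy's gist uses `raw/`, and the example article path is illustrative):

```
mypedia/
├── data/                  # Raw layer: untouched exports (chat.db copy, Day One JSON, …)
├── wiki/                  # Wiki layer: compiled articles linked with [[wikilinks]]
│   └── people/farza.md    # (example path, not prescribed)
└── .claude/skills/wiki/
    └── SKILL.md           # Schema layer: instructions for the compiling LLM
```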
## The tweets that made it viral
- Andrej Karpathy — original LLM Wiki post (Nov 2025) — 12M+ views, kicked the whole thing off: https://x.com/karpathy/status/2040572272944324650
- Karpathy's gist with the skill/prompt (2,100+ stars in 12 hours): https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f
- Farza Majeed — "Farzapedia" (the first widely-shared implementation, the reply that made Karpathy's idea concrete for most people): https://x.com/FarzaTV/status/2040563939797504467
- Farza's published SKILL.md gist — the file you drop into `.claude/skills/wiki/SKILL.md` to get started: https://gist.github.com/farzaa/c35ac0cfbeb957788650e36aabea836d
- Farza's landing page (interest gauge): https://farza.com/knowledge
## Write-ups worth reading
- Analytics Vidhya — technical breakdown: https://www.analyticsvidhya.com/blog/2026/04/llm-wiki-by-andrej-karpathy/
- MindStudio — step-by-step with Claude Code: https://www.mindstudio.ai/blog/andrej-karpathy-llm-wiki-knowledge-base-claude-code
- Toolmesh deep dive: https://www.toolmesh.ai/news/karpathy-llm-wiki-personal-knowledge-management
- Medium — what is an LLM Wiki?: https://medium.com/@aristojeff/what-is-an-llm-wiki-and-why-are-people-paying-attention-to-it-b7e10617967d
## Ingesting your iMessage history
All your iMessages live in a single SQLite database on every Mac: `~/Library/Messages/chat.db`
Requires Full Disk Access for the terminal / app that opens it (System Settings → Privacy & Security → Full Disk Access).
Tables worth knowing:
| Table | What's in it |
|---|---|
| `message` | Every message — text, `attributedBody` (emoji/rich text), `date` (Apple epoch), `is_from_me`, `handle_id` |
| `handle` | Phone numbers and emails of everyone you've ever texted |
| `chat` | Conversations (1:1 or group) |
| `chat_message_join` | Maps messages → chats |
Gotchas:
- `date` is in Apple epoch (nanoseconds since 2001-01-01 UTC). Convert with `datetime(message.date/1e9 + 978307200, 'unixepoch')` in SQL.
- Modern messages store text in `attributedBody` (a typedstream-encoded blob), NOT `text` — you have to decode the blob. The open-source scripts below already handle this.
- Resolve phone numbers to names by joining against Contacts (`~/Library/Application Support/AddressBook/`) — also requires Full Disk Access.
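The table layout and gotchas above combine into one query. A minimal sketch, run here against an in-memory stub of the `chat.db` schema so it is self-contained (the stub columns are a tiny subset of the real tables); point `sqlite3.connect` at `~/Library/Messages/chat.db` to run it for real, Full Disk Access required:

```python
import sqlite3

APPLE_EPOCH_OFFSET = 978307200  # seconds from 1970-01-01 to 2001-01-01 UTC

# The join + Apple-epoch conversion from the gotchas above, in one query.
QUERY = """
SELECT
    datetime(m.date / 1000000000 + 978307200, 'unixepoch') AS sent_at,
    h.id AS contact,   -- phone number or email
    m.is_from_me,
    m.text             -- NULL when the text lives in attributedBody
FROM message m
JOIN chat_message_join cmj ON cmj.message_id = m.ROWID
JOIN chat c ON c.ROWID = cmj.chat_id
LEFT JOIN handle h ON h.ROWID = m.handle_id
ORDER BY m.date
"""

def fetch_messages(conn: sqlite3.Connection) -> list:
    return conn.execute(QUERY).fetchall()

# Stub standing in for ~/Library/Messages/chat.db. For real use:
#   conn = sqlite3.connect(os.path.expanduser("~/Library/Messages/chat.db"))
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE message (ROWID INTEGER PRIMARY KEY, date INTEGER,
                          is_from_me INTEGER, text TEXT, handle_id INTEGER);
    CREATE TABLE handle (ROWID INTEGER PRIMARY KEY, id TEXT);
    CREATE TABLE chat (ROWID INTEGER PRIMARY KEY);
    CREATE TABLE chat_message_join (chat_id INTEGER, message_id INTEGER);
""")
conn.execute("INSERT INTO handle VALUES (1, '+15551234567')")
conn.execute("INSERT INTO chat VALUES (1)")
# 2025-01-01 00:00:00 UTC, expressed as Apple-epoch nanoseconds
apple_ns = (1735689600 - APPLE_EPOCH_OFFSET) * 1_000_000_000
conn.execute("INSERT INTO message VALUES (1, ?, 0, 'hello', 1)", (apple_ns,))
conn.execute("INSERT INTO chat_message_join VALUES (1, 1)")

print(fetch_messages(conn))  # [('2025-01-01 00:00:00', '+15551234567', 0, 'hello')]
```

Decoding `attributedBody` is deliberately left out here; the ingesters listed below already handle it.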
## Skills / scripts that do it for you
The skill file most people use at these events is Farza's wiki skill:
https://gist.github.com/farzaa/c35ac0cfbeb957788650e36aabea836d
— drop it into `.claude/skills/wiki/SKILL.md` in an empty project, put your exports in `data/`, then run `/wiki ingest` and `/wiki absorb all`.
Open-source iMessage ingesters you can crib from:
- `kothari-nikunj/llm-wiki` — most complete. Has a dedicated `ingest_imessage.py` that reads `chat.db` directly (both `text` and `attributedBody`), resolves handles to contact names via Address Book, and groups by contact + day. Also supports WhatsApp, Apple Notes, tweets, bookmarks. Ships a Next.js viewer. → https://github.com/kothari-nikunj/llm-wiki
- `SamurAIGPT/llm-wiki-agent` — drop-in agent, works with Claude Code, Codex, or Gemini CLI. Adds NetworkX + Louvain community detection for knowledge-graph visualization. → https://github.com/SamurAIGPT/llm-wiki-agent
- `ussumant/llm-wiki-compiler` — Claude Code plugin focused on the compile step only. → https://github.com/ussumant/llm-wiki-compiler
- `nashsu/llm_wiki` — cross-platform desktop app wrapper. → https://github.com/nashsu/llm_wiki
## Minimal first run

```sh
mkdir mypedia && cd mypedia
mkdir -p .claude/skills/wiki
curl -sL https://gist.githubusercontent.com/farzaa/c35ac0cfbeb957788650e36aabea836d/raw \
  > .claude/skills/wiki/SKILL.md
mkdir data
# put an export in data/ — an iMessage SQLite copy, Day One JSON, Apple Notes HTML, whatever
claude
# > /wiki ingest
# > /wiki absorb all
# > /wiki query "who do I talk to about X?"
```
## Farzapedia by the numbers (reference)

| Stat | Value |
|---|---|
| Total raw entries ingested | 2,500 |
| Compiled wiki articles | 400 |
| Sources | Day One diary, Apple Notes, iMessage |
| Architecture | Karpathy's 3-layer (raw → wiki → schema) |
| Agent used to query | Claude Code |
| Storage | Plain markdown files — no vector DB, no embeddings |
| Most surprising output | An auto-generated "Relationship History" article (why Farza won't make Farzapedia public) |
## Add your own page
This wiki is public-edit. Anyone can:
- Edit this page — click Edit at the top and save.
- Create a new page — go to `/@jacobcole/agi-house-llmwiki/new` or POST to `/api/v1/wikis/jacobcole/agi-house-llmwiki/pages` (no auth required — anonymous pages are marked claimable).
- Link to your own LLM wiki — drop it under Attendee wikis below.
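For scripted edits, the anonymous POST endpoint above can be called with nothing but the standard library. A hedged sketch: the JSON field names (`title`, `content`) are my assumption rather than documented here, so verify them against the actual API; the request is built and printed but not sent.

```python
import json
import urllib.request

# Hypothetical payload shape for the anonymous page-creation endpoint.
# Field names are assumptions; check the WikiHub REST docs before relying on them.
API_URL = "https://wikihub.md/api/v1/wikis/jacobcole/agi-house-llmwiki/pages"

def build_request(title: str, markdown: str) -> urllib.request.Request:
    body = json.dumps({"title": title, "content": markdown}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={"Content-Type": "application/json"},  # no auth: page is claimable
        method="POST",
    )

req = build_request("My LLM wiki attempt", "# What worked\n\n- notes go here")
# urllib.request.urlopen(req)  # uncomment to actually create the page
print(req.get_method(), req.full_url)
```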
Pages you might want to add:
- Your attempt, with what worked / what broke
- A better ingester for a specific source (WhatsApp, Signal, Slack, Email)
- Tips for the `absorb`/cleanup phase
- Prompts you found useful
- Trip reports on different models (Opus vs. Sonnet vs. GPT vs. Gemini)
## Attendee wikis
Add yours! [[name]] — one-line description — URL
- (add your wiki here)
## FAQ
**Do I need to pay for anything?** No. wikihub.md is free. Claude Code free tier is enough to run a small absorb. Farza used Claude.
**Will my private data leak?** Depends entirely on where you run the compile. Karpathy's #2 principle is local — the files stay on your computer. If you point Claude Code at a local directory, inference happens via the API but the files never leave your machine except in prompts. If you're compiling anything sensitive, audit the SKILL.md first and keep the wiki directory out of any cloud sync.
**How big does it get?** Farza: 2,500 entries → 400 articles, a few MB of markdown. iMessage-only ingests can be 10K–100K messages; they compress down to dozens of articles because most chats are banal.
**Can I push my wiki back to wikihub.md?** Yes — `git clone https://USERNAME:TOKEN@wikihub.md/@USERNAME/SLUG.git` and push. Or use the REST pages API. See `/AGENTS.md` on this host for the full agent setup recipe.
## Related / background
- WikiHub home: https://wikihub.md
- WikiHub agent setup: https://wikihub.md/AGENTS.md
- Karpathy's broader take on personal AI: https://x.com/karpathy/status/2040572272944324650
- Ankur Warikoo's "YBAAS" response (you-are-a-service, one of the better commentaries on why LLM wikis are actually a big deal): https://x.com/warikoo/status/2040595110635901255
- IQ Source — "knowledge compounds or rots" (the case for curation): https://www.iqsource.ai/en/blog/karpathy-llm-wiki-knowledge-compounds-or-rots/
This page is public-editable. Hosted on WikiHub. Created for the AGI House LLM Wiki event, 2026-04-22.
## Connect your MCP client to WikiHub
Want Claude Desktop, Claude Code, or ChatGPT Deep Research to read and write your WikiHub pages with your own API key? Follow the step-by-step setup guide:
- WikiHub MCP Connector — Setup — install, configure, deploy
Short version: build `mcp-server/` in the wikihub repo, paste a two-line JSON block into your Claude config, and `wikihub_search` / `wikihub_create_page` / 15 other tools become available inside your agent.
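For reference, Claude Desktop registers MCP servers through its documented `mcpServers` config shape; everything inside it here is a placeholder (server name, build path, env-var name), so take the real values from the setup guide linked above:

```json
{
  "mcpServers": {
    "wikihub": {
      "command": "node",
      "args": ["/path/to/wikihub/mcp-server/dist/index.js"],
      "env": { "WIKIHUB_API_KEY": "<your API key>" }
    }
  }
}
```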