01 — Helicopter View · LLM Wiki Pattern

The LLM Wiki Pattern

Karpathy's pattern replaces RAG retrieval with a persistent, LLM-maintained knowledge base that compounds with every source

3 Architecture Layers
3 Core Operations
10-15 Pages per Ingest
The Problem
RAG Is Stateless
Every query starts from zero. No cross-references are built, no contradictions are detected, and understanding never compounds. RAG is a librarian that fetches books but never reads them.
No Memory · No Synthesis
Core Concept
What Is the LLM Wiki Pattern?
Instead of retrieving from raw documents at query time, the LLM incrementally builds and maintains a persistent wiki of markdown files. Knowledge is compiled once and kept current, not re-derived on every query. When you add a new source, the LLM reads it, extracts key information, and integrates it into existing pages. Cross-references are pre-built. Contradictions are flagged during ingest. Every source makes the wiki richer.
New Source → LLM Processes → Updates Wiki → Knowledge Compounds
Mental Model
IDE + Programmer + Codebase
Obsidian is the IDE. The LLM is the programmer. The wiki is the codebase. You browse the results in real time while the LLM makes edits based on conversation.
RAG = search engine for fragments. Wiki = living research notebook.
Layer 1
Raw Sources
Immutable collection of source documents. Articles, papers, transcripts, images. The LLM reads from them but never modifies. Your evidence locker and source of truth.
Immutable · Traceable
Layer 2
The Wiki
LLM-generated markdown: summaries, entity pages, concept pages, comparisons, running synthesis. The LLM owns this layer entirely. Knowledge accumulates here.
LLM-Owned · Compounding
Layer 3
The Schema
Config doc (CLAUDE.md) defining structure, conventions, and workflows. The meta-layer and control surface. Human and LLM co-evolve this over time.
Co-Evolved · Meta-Layer
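The schema layer is easiest to picture concretely. The sketch below shows what a starter CLAUDE.md might contain; the headings, file names, and conventions are illustrative assumptions, not something the pattern prescribes:

```markdown
# CLAUDE.md — wiki schema (illustrative sketch)

## Layout
- sources/      — immutable raw documents; read, never edit
- wiki/         — LLM-maintained pages
- wiki/index.md — one line per page
- wiki/log.md   — append-only ingest log

## Page conventions
- One concept or entity per page; link with [[wikilinks]]
- Every claim cites the source file it came from

## Ingest workflow
1. Summarize the new source into its own page
2. Revise related pages; flag any contradictions found
3. Update the index and append to the log
```

Because both human and LLM edit this file over time, it doubles as the audit trail of how the wiki's conventions evolved.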
Architecture
[Diagram: the human curator maintains the Schema (L3); Ingest, Query, and Lint operations connect Raw Sources (L1) to the Wiki (L2), where knowledge compounds.]
Operation 1
Ingest
Drop in a new source. The LLM reads it, writes a summary, updates the index, revises 10-15 related wiki pages, and appends to the log. Ingesting one source at a time, with human review, is preferred.
Source → Process → Update
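The bookkeeping half of an ingest can be sketched in a few lines. This is a minimal sketch under assumed file names (`wiki/`, `index.md`, `log.md`); the `summarize` stub stands in for the LLM call that actually writes the summary and revises related pages:

```python
from pathlib import Path
from datetime import date

WIKI = Path("wiki")

def summarize(text: str) -> str:
    """Stand-in for the LLM call that writes the summary and
    decides which existing pages need revision."""
    return text[:200]  # a real run delegates this to the model

def ingest(source: Path) -> Path:
    """File one source: summary page, index entry, log line."""
    WIKI.mkdir(exist_ok=True)
    summary = summarize(source.read_text())
    page = WIKI / f"{source.stem}.md"
    page.write_text(f"# {source.stem}\n\n{summary}\n\nSource: {source}\n")
    # Append to the running index and the append-only ingest log
    with (WIKI / "index.md").open("a") as idx:
        idx.write(f"- [[{source.stem}]]\n")
    with (WIKI / "log.md").open("a") as log:
        log.write(f"{date.today()} ingested {source.name}\n")
    return page
```

Keeping the index and log as plain appends is what makes one-at-a-time review cheap: each ingest is a small, inspectable git diff.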
Operation 2
Query
Ask questions against the wiki. The LLM reads the index, finds the relevant pages, and synthesizes an answer with citations. Good answers are filed back into the wiki, so explorations compound too.
Cited · Filed Back
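The page-selection step can be approximated mechanically. A minimal sketch, assuming the same `wiki/` layout as above; the keyword overlap is a crude stand-in for the LLM's read of the index, and the synthesis step is deliberately stubbed:

```python
from pathlib import Path

WIKI = Path("wiki")

def relevant_pages(question: str) -> list[Path]:
    """Crude stand-in for the LLM's index read: keyword overlap
    between the question's words and page titles."""
    words = {w.strip("?.,").lower() for w in question.split()}
    return [p for p in WIKI.glob("*.md")
            if p.stem.lower() in words and p.name != "index.md"]

def query(question: str) -> str:
    """Answer from the wiki, citing the pages consulted."""
    pages = relevant_pages(question)
    cites = ", ".join(f"[[{p.stem}]]" for p in pages) or "no pages found"
    body = "\n\n".join(p.read_text() for p in pages)
    # A real run hands `body` to the model for synthesis;
    # here we only demonstrate the citation trail.
    return f"Answer drawn from: {cites}"
```

The point of the citation trail is the file-back step: an answer that names its pages can itself become a page.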
Operation 3
Lint
Health-check the wiki: find contradictions, stale claims, orphan pages, missing concepts, and data gaps. The lint pass also suggests new questions and sources to investigate.
Contradictions · Orphans
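Two of the lint checks are purely mechanical and need no model at all. A sketch, assuming Obsidian-style `[[wikilinks]]` and the `wiki/` layout used earlier; contradiction and staleness checks would still need the LLM itself:

```python
import re
from pathlib import Path

WIKI = Path("wiki")
LINK = re.compile(r"\[\[([^\]]+)\]\]")  # Obsidian-style wikilinks

def lint() -> dict[str, list[str]]:
    """Mechanical half of the lint pass: orphan pages (no inbound
    links) and broken links (targets with no page)."""
    pages = {p.stem: p.read_text() for p in WIKI.glob("*.md")}
    linked = {t for text in pages.values() for t in LINK.findall(text)}
    return {
        "orphans": sorted(s for s in pages if s not in linked and s != "index"),
        "broken": sorted(t for t in linked if t not in pages),
    }
```

Running this before each ingest keeps the cheap problems from accumulating, so the LLM's attention goes to the semantic checks.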
Comparison
RAG vs LLM Wiki
| Dimension      | RAG        | Wiki          |
| -------------- | ---------- | ------------- |
| Knowledge      | Stateless  | Compounds     |
| Cross-refs     | Query-time | Pre-built     |
| Setup          | Higher     | Markdown only |
| Scale          | Millions   | Hundreds      |
| Contradictions | Missed     | Flagged       |
Anti-Patterns
What Breaks It
Where It Shines
Best Use Cases for the Pattern
Any domain where knowledge accumulates over weeks or months and cross-referencing matters. The LLM handles the maintenance nobody wants to do. Human thinks, LLM bookkeeps.
Research Deep Dives · Book Companion · Personal Development · Team Knowledge · Competitive Intel · Course Notes
Get Started
5 Steps to Begin
Just a git repo of markdown files. No database. No vector store.
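Since the whole setup is markdown in a git repo, bootstrapping it is a one-function job. A minimal sketch; the directory and file names are illustrative assumptions, and running `git init` in the root afterward completes the setup:

```python
from pathlib import Path

def bootstrap(root: Path) -> None:
    """Create the minimal skeleton: sources, wiki, schema.
    (Names are illustrative; the pattern only requires
    markdown files under version control.)"""
    (root / "sources").mkdir(parents=True, exist_ok=True)  # Layer 1: raw sources
    wiki = root / "wiki"                                   # Layer 2: the wiki
    wiki.mkdir(exist_ok=True)
    (wiki / "index.md").write_text("# Index\n")
    (wiki / "log.md").write_text("# Ingest Log\n")
    (root / "CLAUDE.md").write_text("# Schema\n")          # Layer 3: the schema
```

No database, no vector store: everything the three layers need is a directory, a few seed files, and a commit history.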
Detail pages: 02 Three-Layer Architecture · 03 Operations & Workflows · 04 RAG vs Wiki Trade-offs