Andrej Karpathy, co-founder of OpenAI and former Director of AI at Tesla, recently open-sourced a fascinating concept: building a “Personal Wikipedia” using AI agents. Rather than treating an LLM like a search engine that retrieves chunks of data and immediately forgets them, this system compiles your raw notes, articles, and screenshots into an interlinked, living markdown wiki.
The Problem with Traditional Note-Taking (and RAG)
Most note-taking apps are built for humans to browse. If you use Notion, Obsidian, or Apple Notes, the burden of organizing, tagging, and cross-referencing falls entirely on you. Even with Retrieval-Augmented Generation (RAG) tools like NotebookLM or ChatGPT file uploads, the AI retrieves context for a single answer, but it doesn’t “learn” from it or permanently organize your knowledge base.
Karpathy’s LLM Wiki flips this paradigm. It’s optimized for the AI to read and write on your behalf. The knowledge is compiled once, updated incrementally, and gets richer over time. One new source can ripple through 10 to 15 existing pages, flagging contradictions and linking concepts.
How the LLM Wiki Works
The architecture is surprisingly simple and doesn’t require a complex vector database or embedding pipeline. It relies on three core layers:
- Immutable Raw Sources: A directory where you dump raw materials—PDFs, diaries, Apple Notes, iMessage threads, or screenshots.
- Wiki Directory: The folder the AI agent owns and maintains, filled with markdown files representing different entities and concepts.
- Index.md: A catalog file that contains one-line summaries of every page. The agent reads this first to navigate the wiki like a file system.
A great example of this in the wild is Farzapedia, built by Farza on X. He fed 2,500 entries of personal data into an LLM, which spit out 400 interconnected articles. The agent now uses this wiki as a persistent memory bank, surfing through cross-referenced entries to surface inspiration for his landing pages and projects.
Installation Instructions
If you want to set up Karpathy’s LLM Wiki locally, you’ll use Claude Code (Anthropic’s terminal-based coding agent) pointing to a folder of Markdown files (like an Obsidian Vault).
Mac, Windows, and Linux Setup
The setup is identical across all platforms since it relies on Node.js.
- Install Node.js: Ensure you have Node.js installed on your system.
- Create your Vault: Create a folder anywhere on your computer (e.g., ~/Documents/llm-wiki) to act as your knowledge base. You can use Obsidian to view the markdown files visually.
- Install Claude Code: Run the following command in your terminal to install Anthropic’s local CLI agent globally:

npm install -g @anthropic-ai/claude-code

Once installed, authenticate with your Anthropic account, navigate to your wiki folder, and launch the agent:
cd ~/Documents/llm-wiki
claude

You can ask the agent to organize your raw notes, create summaries, or find cross-references across your entire vault.
Why This is the Future of Personal Knowledge
This workflow solves the “knowledge rot” problem. Instead of your notes becoming a messy graveyard of forgotten ideas, the AI agent does the boring bookkeeping. It creates pages, links them together, and treats your explorations as compounding knowledge.