The LLM-native memory pattern: a short field report
A short, dated history of how the 'personal markdown wiki for AI agents' pattern emerged, who shipped what, and why it all converged on the same four properties. Reading this first makes every other AI-memory recipe make more sense.
- Time: 10-minute read
- Cost: $0
- Stack: a browser, maybe a highlighter
You’re stuck with
You keep seeing 'personal wiki for AI agents' posts and can't tell what's actually new versus what's been working for years. You want the shape of the story, not another hot take.
You end up with
A single-page map of the builders, the dates, the naming moments, and the constraints that kept surfacing, enough context to read any future 'AI memory' thread and know exactly where it fits.
The recipe
2022. kepano writes "File over app"
Before any LLM wrote a useful personal note, kepano (Obsidian's CEO) was already arguing that your knowledge should live in files you own, not in apps that own you. The essay was called "File over app." The thesis: apps are temporary, file formats survive.
That line became the philosophical root of every serious AI-memory project that shipped later. Every time someone says "your data in plain markdown beats a database," they are pointing back at this framing, whether they know it or not.
2024. Daniel Miessler ships Fabric and PAI
Miessler's Fabric framework and his PAI (Personal AI) system took the file-over-app philosophy and added real AI plumbing. Docs, knowledge, learnings, all in plain markdown, all readable by any LLM, all pointed at one concrete outcome: an AI that actually knows you.
Fabric made the pattern reproducible. If you'd seen it in 2024, you already had the blueprint.
Late 2025. Kevin Nguyen open-sources ByteRover
ByteRover was one of the earliest production-looking "structured markdown vault as agent memory" projects, and Nguyen open-sourced it. His line stuck: "Infinite context windows are an endless tax. A living wiki is a compounding asset."
By the time ByteRover shipped, the shape of the pattern was visible to anyone paying attention, but it still did not have a public name.
Early 2026. Farza ships Farzapedia
Farza turned 2,500 diary entries, Apple Notes, and iMessage threads into "Farzapedia", 400 AI-generated articles about the people, projects, and ideas in his life. A personal Wikipedia, owned and local.
Farzapedia was the moment the pattern stopped being a builder's inside joke and became a mainstream-curious product. It was also the direct trigger for the tweet that would name everything.
April 4, 2026. Karpathy names the pattern
Andrej Karpathy quote-tweeted Farzapedia and distilled the pattern into four properties:
- Explicit, navigable, inspectable memory
- Yours, local, owned, vendor-independent
- File over app, plain files, universal formats
- BYOAI, any agent, any model, interchangeable
The post hit 1.1M views, 8.7K likes, and, most telling, 12,500 bookmarks. A 1.1% save rate is a tell: people weren't just amused, they were saving it to build from later.
Karpathy didn't invent the pattern. He named it publicly, and made the convergence visible to anyone who'd been too busy building to notice the shape.
April–May 2026. The ecosystem answers
Within weeks, the response showed up in code:
- Nikunj Kothari publishes LLMwiki, an open-source "vibe-coded with Opus 4.5" tool that takes bookmarks, messages, and writing and generates a static personal Wikipedia. 80 GitHub stars in a weekend, 30K views on the launch tweet.
- Daniel Miessler announces he's formalizing his PAI at the /pai URL path as the public-facing version of the pattern.
- Muratcan publishes "The File System Is the New Database: How I Built a Personal OS for AI Agents", the first-principles case for the same constraints.
- The #pkg (personal knowledge graph) framing shows up in the replies; kidehen and others argue the Semantic Web finally has a personal form.
The pattern now has a name, a bookmark count, a canonical post to link, and half a dozen open-source starting points. It is no longer an inside joke.
What to take from the story
The pattern converged independently. Six unrelated builders, working in different stacks and different years, ended up with the same four constraints. That kind of convergence is a strong signal: the constraints are real, not aesthetic.
Naming makes markets. The pattern existed for years before Karpathy's tweet. The tweet did not invent anything, it made the pattern legible. Once a pattern is legible, the ecosystem catches up in weeks.
File over app is the load-bearing idea. Every other constraint is a consequence of it. If your memory is files, it's inspectable, local, and agent-agnostic by default. If your memory is a feature, none of those things are guaranteed.
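The constraint set is concrete enough to sketch. As a minimal illustration, assuming nothing beyond the Python standard library (the vault layout, file-naming scheme, and helper names here are illustrative choices, not any shipped project's actual format), a files-as-memory loop can be this small:

```python
# Minimal "file over app" memory sketch. Notes are plain markdown files
# in a directory; any editor, agent, or model can read or write them.
from datetime import date
from pathlib import Path


def write_note(vault: Path, title: str, body: str) -> Path:
    """Create or append to a plain markdown note, dated per entry."""
    vault.mkdir(parents=True, exist_ok=True)
    note = vault / f"{title.lower().replace(' ', '-')}.md"
    with note.open("a", encoding="utf-8") as f:
        if note.stat().st_size == 0:
            f.write(f"# {title}\n")  # first write gets a top-level heading
        f.write(f"\n## {date.today().isoformat()}\n\n{body}\n")
    return note


def load_memory(vault: Path) -> str:
    """Concatenate the vault into one context string for any LLM."""
    return "\n\n".join(
        p.read_text(encoding="utf-8") for p in sorted(vault.glob("*.md"))
    )
```

Because the store is just files, every property on Karpathy's list falls out for free: you can `grep` it, sync it, version it with git, and point any agent at it without an export step.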
You are not late. This story ends in April 2026. The tooling is still nascent, the conventions are still forming, and the best vault you could run today looks very different from the best vault possible in a year. Starting now still puts you ahead of most people.
Where to go next
If you want to build the pattern rather than read about it, the companion workflow is the recipe at /workflows/karpathy-wiki-llm-knowledge-base: implement the four principles, end to end, in one afternoon.
If you want to go beyond a wiki and run a full multi-agent personal operating system, the advanced recipe is /workflows/personal-llm-operating-system: cron automation, mobile bridge, live data feeds, session protocols, the whole stack.
The field report is the map. The two recipes are the routes.