Key Takeaways
- A knowledge base maintained by an LLM compounds rather than decays: each new piece of information is integrated, cross-referenced, and made searchable automatically.
- The Karpathy LLM Wiki pattern — where the agent maintains and updates a structured knowledge store — is now replicable with a Hetzner VPS and a Telegram bot.
- The value of a compounding knowledge base grows non-linearly: the more it knows, the better it can contextualise new information.
The Problem With Human-Maintained Wikis
Personal wikis maintained by humans tend to rot: they're accurate when created, outdated within months, and eventually abandoned. The curation overhead is too high to sustain alongside actual work.
An LLM-maintained wiki inverts this: the agent handles integration, cross-referencing, and search without requiring the human to actively curate.
The result, in the original author's words, is a wiki that “compounds over time instead of rotting”.
The Stack
The stack is a Hetzner VPS (cheap, always-on), the Hermes agent (the knowledge worker), a Telegram bot (the access interface), and a static site generator for the public-facing version at wiki.ai-biz.app. The Karpathy LLM Wiki pattern provides the conceptual framework: the agent is the primary curator, not a search tool.
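The story doesn't spell out the plumbing, but the capture path is simple enough to sketch. The following is a minimal illustration, not the actual implementation: it long-polls the public Telegram Bot HTTP API and hands each message to a hypothetical `integrate_note()` function that stands in for the Hermes agent call; the wiki is assumed to live as Markdown files on the VPS.

```python
# Sketch of the Telegram -> agent -> wiki loop (assumptions, not the
# original setup): long polling via the Telegram Bot HTTP API, and
# integrate_note() as a placeholder for the Hermes agent's filing step.
import os
import time
import requests

TOKEN = os.environ["TELEGRAM_BOT_TOKEN"]
API = f"https://api.telegram.org/bot{TOKEN}"
WIKI_DIR = "/srv/wiki"  # hypothetical location of the wiki on the VPS


def integrate_note(text: str) -> str:
    """Placeholder for the agent step: in the real setup the LLM decides
    where the note belongs, cross-references it, and rewrites pages."""
    path = os.path.join(WIKI_DIR, "inbox.md")
    with open(path, "a", encoding="utf-8") as f:
        f.write(f"\n- {text}\n")
    return f"Filed note in {path}"


def main() -> None:
    offset = None
    while True:
        # Long-poll Telegram for new messages sent to the bot.
        resp = requests.get(
            f"{API}/getUpdates",
            params={"timeout": 30, "offset": offset},
            timeout=60,
        ).json()
        for update in resp.get("result", []):
            offset = update["update_id"] + 1
            message = update.get("message") or {}
            text = message.get("text")
            if not text:
                continue
            reply = integrate_note(text)
            # Acknowledge so the phone stays a thin capture interface.
            requests.post(
                f"{API}/sendMessage",
                json={"chat_id": message["chat"]["id"], "text": reply},
                timeout=30,
            )
        time.sleep(1)


if __name__ == "__main__":
    main()
```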
How It Compounds
Each new piece of information added to the wiki gets integrated with existing entries: cross-referenced, tagged, and contextualised against what's already there. Unlike a flat note-taking tool, an agent-maintained wiki identifies connections between new and existing knowledge, surfacing relationships the human wouldn't have thought to establish manually.
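To make the integration step concrete, here is a deliberately simplified sketch of what "cross-referencing" could look like. Word-overlap similarity is a crude stand-in for the LLM's actual judgement about relatedness, and the `Page` structure is an assumption; the point is only that every insertion also enriches the entries that were already there.

```python
# Illustrative sketch (not the actual Hermes mechanism): when a new page
# arrives, rank existing pages by word-overlap similarity and record
# links in both directions, so old entries gain backlinks over time.
from dataclasses import dataclass, field


@dataclass
class Page:
    title: str
    body: str
    links: set[str] = field(default_factory=set)


def tokens(text: str) -> set[str]:
    return {w.lower().strip(".,;:!?") for w in text.split() if len(w) > 3}


def similarity(a: str, b: str) -> float:
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0


def integrate(new: Page, wiki: dict[str, Page], top_k: int = 3) -> None:
    """Cross-reference a new page against everything already in the wiki."""
    ranked = sorted(
        wiki.values(), key=lambda p: similarity(new.body, p.body), reverse=True
    )
    for related in ranked[:top_k]:
        if similarity(new.body, related.body) > 0:
            new.links.add(related.title)   # new page points at old knowledge
            related.links.add(new.title)   # old page gains a backlink too
    wiki[new.title] = new
```

Because each addition updates the neighbours it touches, the graph of links densifies with use, which is the mechanical version of "the more it knows, the better it can contextualise new information."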
The wiki at six months is meaningfully more valuable than the wiki at month one. That compounding is the core value proposition.
Story sourced from the official Nous Research Hermes user-stories page. Original author: @Jsong.