Personal wikis usually fail at the same step. Capture is easy. Maintenance is not.

The notes pile up faster than they can be linked, summarized, or reconciled. Cross-references go stale. Old assumptions sit next to new ones without any flag that they disagree. After a few months the wiki becomes a folder of drafts that nobody trusts.

The pattern that has worked for me is to split the system into three layers, each with a different owner, and to let an AI agent do the bookkeeping that humans abandon.

The three layers

Sources. Raw inputs: clipped articles, exports, transcripts, meeting notes, screenshots, anything captured from the outside. This layer is read-only. The agent never modifies it. New sources arrive here and stay here.

Wiki. Curated notes derived from the sources. Entity pages, concept pages, summaries, cross-references. This is where synthesis happens. The agent owns most of the writing here, following rules defined in the schema.

Schema. A small set of files describing how the wiki is organized: directory conventions, page formats, naming rules, ingest workflow. This is what turns the agent into a disciplined wiki maintainer instead of a generic chatbot. It evolves over time as the wiki grows.

The point of the split is that each layer has a different rate of change, a different owner, and a different risk profile. Confusing them is what makes most personal knowledge systems collapse.
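As a concrete sketch, the three layers can be nothing more than three sibling directories (names here are hypothetical, not a required convention):

```text
kb/
├── sources/     # raw inputs: clippings, exports, transcripts (read-only)
├── wiki/        # curated pages the agent maintains
│   └── index.md # entry point the agent reads first
└── schema/      # conventions: naming, page formats, ingest rules
```

The layout itself carries the risk profile: anything under `sources/` is immutable, anything under `wiki/` is fair game for the agent, and `schema/` changes only when a human decides the rules should change.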

Three operations

Most of the day-to-day work in a wiki of this shape reduces to three operations:

Ingest. A new source arrives. The agent reads it, summarizes the takeaways, writes a summary page, updates the index, and edits any existing pages whose claims are affected by the new material. Contradictions are flagged inline rather than silently overwritten.
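A minimal sketch of the mechanical half of ingest, assuming the hypothetical `sources/` and `wiki/` layout; `summarize` stands in for the agent call, and the real system would also diff the new material against existing pages:

```python
from pathlib import Path

def summarize(text: str) -> str:
    """Placeholder for the agent; a real system would invoke an LLM here."""
    return text.strip().splitlines()[0] if text.strip() else ""

def ingest(source: Path, wiki: Path) -> Path:
    """Write a summary page for one raw source and register it in the
    index. The source file itself is never modified."""
    wiki.mkdir(parents=True, exist_ok=True)
    page = wiki / f"{source.stem}-summary.md"
    page.write_text(f"# Summary of {source.name}\n\n{summarize(source.read_text())}\n")
    # Append an index entry unless the page is already listed.
    index = wiki / "index.md"
    entry = f"- [{page.stem}]({page.name})\n"
    body = index.read_text() if index.exists() else "# Index\n\n"
    if entry not in body:
        index.write_text(body + entry)
    return page
```

The point of the sketch is the shape, not the summarization: every ingest touches exactly one new page plus the index, and leaves the source untouched.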

Query. A question is asked. The agent reads the index first, drills into the relevant pages, and answers with citations to specific notes. Useful answers can be filed back into the wiki as new synthesis pages, so the insight is not lost in chat history.
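The drill-down step can be sketched as a crude keyword match over wiki pages; this is a stand-in for the agent's retrieval, not real search, and the function name is hypothetical:

```python
from pathlib import Path

def query(question: str, wiki: Path) -> list[str]:
    """Return the names of wiki pages that share words with the question,
    as citable sources for an answer. Skips the index itself."""
    words = {w.lower() for w in question.split() if len(w) > 3}
    hits = []
    for page in sorted(wiki.glob("*.md")):
        if page.name == "index.md":
            continue
        text = page.read_text().lower()
        if any(w in text for w in words):
            hits.append(page.name)
    return hits
```

What matters is the contract: answers point back at specific pages, so a useful answer can be filed as a new synthesis page that cites the same files.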

Lint. Periodically, the agent runs a health check: orphan pages, stale claims, missing concepts, unresolved contradictions, places where the index is out of sync with the actual files. This is the maintenance pass that humans tend not to do consistently.
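Two of the lint checks are purely mechanical and can be sketched without the agent at all, assuming the index links pages with standard markdown links; stale-claim and contradiction checks are the part that genuinely needs the agent:

```python
import re
from pathlib import Path

def lint(wiki: Path) -> dict[str, list[str]]:
    """Report orphan pages (on disk but absent from the index) and
    dangling index links (listed in the index but missing on disk)."""
    index = wiki / "index.md"
    body = index.read_text() if index.exists() else ""
    on_disk = {p.name for p in wiki.glob("*.md")} - {"index.md"}
    in_index = set(re.findall(r"\(([^)]+\.md)\)", body))
    return {
        "orphans": sorted(on_disk - in_index),
        "dangling": sorted(in_index - on_disk),
    }
```

Run on a schedule, a check like this surfaces exactly the drift that humans stop noticing: the report is either empty or it is a to-do list for the next agent pass.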

Why this pattern survives

The reason wikis usually rot is that the cost of maintenance grows faster than the value of any single new note. An agent does not get bored. It does not forget the rules. It can edit fifteen files in a single pass without losing focus.

The agent does not create the knowledge. The human still curates sources, asks the questions, and decides what matters. The agent handles linking, summarizing, and consistency: the parts that compound into entropy when left undone.

What the schema actually contains

The schema is the part that turns a folder of markdown into a maintained system. A workable schema usually defines:

  • Where raw sources live, and that they are immutable.
  • Where wiki pages live, and how they are named.
  • The shape of an entity page versus a concept page versus a summary page.
  • How to format frontmatter and cross-references.
  • What happens during ingest, query, and lint.
  • What is sensitive and must never be summarized or published.
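A schema covering those rules can fit in one short file. This excerpt is hypothetical, not a prescribed format; the filenames and frontmatter fields are examples:

```markdown
<!-- schema/conventions.md (hypothetical) -->
# Wiki conventions

- Raw sources live in `sources/`; never edit or delete them.
- Wiki pages live in `wiki/`, kebab-case, one entity or concept per file.
- Every page starts with frontmatter: `type` (entity | concept | summary),
  `sources` (the files it derives from), `updated` (ISO date).
- Cross-references are relative markdown links; update `wiki/index.md`
  on every ingest.
- Flag contradictions inline with a `> CONFLICT:` blockquote; never
  silently overwrite an existing claim.
- Anything under `sources/private/` is never summarized or published.
```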

It does not need to be long. It needs to be clear enough that an agent reading it cold can do the right thing without guessing.

A boundary, not a workflow

The most important property of this pattern is the boundary, not the automation. Sources stay raw. The wiki stays curated. The schema stays minimal. The agent stays inside the rules.

That separation is what makes the system safe to grow. Adding a new agent, a new ingest source, or a new publishing layer becomes a question about which folder it touches and which rules it follows — not a question about whether the whole system will quietly rewrite itself overnight.