
A practical walkthrough showing how to build an LLM-powered personal knowledge system from YouTube transcripts, web articles, and meeting notes. The video demonstrates a fast, reproducible pipeline and compares Karpathy-style LLM wikis with traditional RAG approaches.
– Ingest & organize: use Claude Code to pull transcripts and sources, auto-chunk the content, and generate a linked markdown wiki (raw/wiki/index/log).
– Tools & workflow: visualize and edit the wiki in Obsidian, add sources with a web clipper, maintain a hot cache, and run linting for consistency and token efficiency.
– Trade-offs & scale: the LLM wiki reduces token costs and simplifies infrastructure (it’s just markdown files), but traditional semantic-search/RAG remains better for millions of documents and enterprise scale.
– Use cases: builds a YouTube knowledge graph, a personal “second brain,” and context for executive-assistant agents; the setup demo completes in minutes.
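The ingest step above can be sketched in a few lines. This is a minimal illustration, not the video’s actual implementation: it assumes a plain-text transcript is already available, uses a hypothetical word-count chunk size, and writes a `notes/` folder plus an `index.md` of Obsidian-style `[[wikilinks]]` to mimic the raw/wiki/index layout.

```python
import re
from pathlib import Path

def chunk_text(text, max_words=200):
    """Split text into word-bounded chunks of roughly max_words each."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def build_wiki(transcript, title, root="wiki"):
    """Write chunked notes plus an index that links them; return note count."""
    root = Path(root)
    notes_dir = root / "notes"
    notes_dir.mkdir(parents=True, exist_ok=True)
    # Slugify the title for stable, filesystem-safe note names (assumed convention).
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    links = []
    for n, chunk in enumerate(chunk_text(transcript), start=1):
        name = f"{slug}-{n:03d}.md"
        (notes_dir / name).write_text(f"# {title} (part {n})\n\n{chunk}\n")
        links.append(f"- [[notes/{name}]]")
    (root / "index.md").write_text("# Index\n\n" + "\n".join(links) + "\n")
    return len(links)
```

In a real pipeline the chunking would likely be semantic (section or topic boundaries) rather than a fixed word count, but the folder-of-markdown output is the same.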
Quotes:
– “It finally makes AI feel like a tireless colleague who actually remembers everything.”
– “It’s literally just a folder with markdown files.”
– “Knowledge compounds like interest in a bank.”
Statistics
| Statistic | Value |
|---|---|
| Upload date | 2026-04-05 |
| Likes | 9759 |
| Comments | 485 |
| Statistics updated | 2026-04-16 |
Specification: Andrej Karpathy Just 10x’d Everyone’s Claude Code