Wink Pings

Hermes Agent Integrates Karpathy's LLM-Wiki, Now Automatically Builds Knowledge Bases for You

A magical feature that lets AI read papers, crawl code, and organize materials independently. Enter `/llm-wiki`, and it generates 42 wiki pages and 365 cross-references in three sessions, with GRPO as the core node.

Hermes Agent now comes with the LLM-Wiki feature.

Karpathy's LLM-Wiki was previously used to help AI read papers, crawl code, and organize materials, then automatically build an Obsidian knowledge base. Now Hermes has integrated it — just enter `/llm-wiki`, and it starts working on its own.

How effective is it? Teknium gave it a try, and these are the results from three ingestion sessions:

- 42 wiki pages

- 13 original sources

- 365 cross-references

- 41 unique link targets

![Complex network graph showing multiple nodes and their interconnections](https://wink.run/image?url=https%3A%2F%2Fpbs.twimg.com%2Fmedia%2FHFRn5cAbYAAM8ZE%3Fformat%3Dpng%26name%3Dlarge)

In the image, `[[grpo]]` is the most connected node with 42 inbound links, followed by `[[nemo-rl]]` (36) and `[[atropos]]` (34). Each page links to at least 2 other pages, with no orphan pages or broken links.
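How LLM-Wiki computes these numbers isn't public, but stats like inbound links, orphan pages, and broken links are easy to audit yourself on any Obsidian vault. A minimal sketch (the `audit_vault` helper and its regex are hypothetical, not Hermes code):

```python
import re
from pathlib import Path

# Captures the target of [[target]] or [[target|alias]] wikilinks.
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def audit_vault(vault_dir):
    """Scan a flat Obsidian vault for inbound links, orphans, and broken links."""
    pages = {p.stem: p.read_text(encoding="utf-8")
             for p in Path(vault_dir).glob("*.md")}
    inbound = {name: 0 for name in pages}
    broken = []
    for name, text in pages.items():
        for target in WIKILINK.findall(text):
            target = target.strip()
            if target in pages:
                if target != name:  # count cross-references, not self-links
                    inbound[target] += 1
            else:
                broken.append((name, target))
    orphans = [n for n, count in inbound.items() if count == 0]
    return inbound, orphans, broken
```

Sorting `inbound` by count would surface the `[[grpo]]`-style hub pages; empty `orphans` and `broken` lists correspond to the "no orphan pages or broken links" claim above.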

There's also a landscape comparison page `nemo-rl-vs-alternatives` that covers 13 frameworks, divided into three layers (full-stack RL, training/fine-tuning, environments), with an algorithm support matrix and infrastructure stack breakdown. Opening it in Obsidian is like having a map of the entire RL field.

![Terminal output screenshot showing code and command results](https://wink.run/image?url=https%3A%2F%2Fpbs.twimg.com%2Fmedia%2FHFRok_2a8AAYMXw%3Fformat%3Djpg%26name%3Dlarge)

How to use it? It's simple:

```
hermes update
```

Then type `/llm-wiki` in a new message or session, and that's it.

A user named megabyte0x has already set it up for his fitness agent. He previously wrote scripts to save daily logs to PSQL and ran weekly reports via cron. Now that Hermes integrates LLM-Wiki, the graph it generated in one day is denser than what he built manually in a week:

![User's comparison chart](https://wink.run/image?url=https%3A%2F%2Fpbs.twimg.com%2Fmedia%2FHFNqqCIa4AADRNM%3Fformat%3Djpg%26name%3Dlarge)

He also shared a video of his earlier fitness agent.

A user named Somi AI asked: How does this handle links between notes? Automatically generated Obsidian vaults usually have issues with links.

Teknium replied that he doesn't know the exact implementation details, but the results look good, and Obsidian handles the links on its own.
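Obsidian resolves `[[wikilinks]]` by note name, so auto-generated vaults usually break when the link text doesn't match the note's filename. One common way a generator avoids that (a hypothetical sketch, not LLM-Wiki's actual implementation) is to derive both the filename and every link from the same slug:

```python
import re

def slug(title):
    """Normalize a page title into the filename-safe slug used for both
    the note file and every [[wikilink]] that points to it."""
    s = title.lower().strip()
    s = re.sub(r"[^a-z0-9]+", "-", s).strip("-")
    return s

def wikilink(title):
    """Emit an Obsidian wikilink that targets the slugged note name but
    still displays the original title, e.g. [[nemo-rl|NeMo RL]]."""
    return f"[[{slug(title)}|{title}]]"
```

As long as the note for "NeMo RL" is saved as `nemo-rl.md` and every reference goes through `wikilink()`, Obsidian's graph view resolves everything without manual cleanup.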

One incident worth noting: this morning, OpenAI quietly changed the response format of the Responses API, breaking Hermes for many users. Teknium stayed up all night on a fix, and it's now working; just update to the latest version and restart the gateway.

Summary: Who is this for? When you need to research a field — say, the differences between various RL frameworks, Hindsight, and Honcho — you previously had to curate a `readme.md` yourself. Now tell Hermes "hey, update your llm-wiki brains" and it will read, organize, and link everything on its own.

Don't like it? `/llm-wiki` is a skill; just turn it off if you don't want it.

Published: 2026-04-07 15:29