Give Cursor and Claude Code persistent memory powered by Honcho. Your AI assistant remembers what you're working on, your preferences, and what it was doing -- across context wipes, session restarts, and tab closures.
Claude Code users: See claude-honcho for Claude Code-specific installation. The underlying plugin codebase is shared and host-aware.
| Plugin | Description |
|---|---|
| honcho | Persistent memory for Cursor sessions |
| honcho-dev | Skills for building AI apps with the Honcho SDK |
macOS / Linux:

```bash
curl -fsSL https://raw.githubusercontent.com/plastic-labs/cursor-honcho/main/install.sh | bash
```

Windows (PowerShell):

```powershell
irm https://raw.githubusercontent.com/plastic-labs/cursor-honcho/main/install.ps1 | iex
```

The installer handles everything: bun, dependencies, global hooks, and MCP config. You just need a free API key.
- Bun -- install with `curl -fsSL https://bun.sh/install | bash`
- Honcho API key -- free at app.honcho.dev

Add to your shell config (`~/.zshrc` or `~/.bashrc`):

```bash
export HONCHO_API_KEY="hch-your-key-here"
```

Reload: `source ~/.zshrc`
```bash
cd /path/to/cursor-honcho/plugins/honcho
bun install
```

In any project where you want Honcho memory, create a `.cursor/` directory with three things:

`.cursor/hooks.json` -- registers all 9 lifecycle hooks:

```json
{
  "version": 1,
  "hooks": {
    "sessionStart": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/session-start.ts" }
    ],
    "sessionEnd": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/session-end.ts" }
    ],
    "beforeSubmitPrompt": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/before-submit-prompt.ts" }
    ],
    "postToolUse": [
      {
        "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/post-tool-use.ts",
        "matcher": "Write|Edit|Shell|Task|MCP"
      }
    ],
    "preCompact": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/pre-compact.ts" }
    ],
    "stop": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/stop.ts" }
    ],
    "subagentStop": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/subagent-stop.ts" }
    ],
    "afterAgentThought": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/after-agent-thought.ts" }
    ],
    "afterAgentResponse": [
      { "command": "bun run /path/to/cursor-honcho/plugins/honcho/hooks/after-agent-response.ts" }
    ]
  }
}
```

Replace `/path/to/cursor-honcho` with the actual absolute path to this repo on your machine.
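Each hook entry above is just a script that reads an event payload and prints a result. As an illustrative sketch only -- the field names `sessionId` and `additionalContext` are hypothetical, not Cursor's actual hook schema -- a hook's core logic might look like:

```typescript
// Hypothetical hook sketch. Field names ("sessionId", "additionalContext")
// are illustrative assumptions, not Cursor's real hook schema.
export function buildResponse(context: string): { additionalContext: string } {
  // Only JSON should reach stdout from a hook; stray logging is what
  // causes the JSON parse errors mentioned under Troubleshooting.
  return { additionalContext: context };
}

export function runHook(rawEvent: string): string {
  const event = JSON.parse(rawEvent); // payload delivered by the host
  // A real hook would call Honcho here using fields from `event`.
  return JSON.stringify(buildResponse(`session ${event.sessionId ?? "?"}`));
}
```

In this sketch, everything human-readable stays off stdout so the host can parse the response.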
`.cursor/mcp.json` -- connects the MCP server for mid-conversation memory tools:

```json
{
  "mcpServers": {
    "honcho": {
      "command": "bun",
      "args": ["run", "/path/to/cursor-honcho/plugins/honcho/mcp-server.ts"]
    }
  }
}
```

`.cursor/rules/honcho-memory.mdc` -- the always-on memory rule:

```bash
cp /path/to/cursor-honcho/plugins/honcho/rules/honcho-memory.mdc .cursor/rules/
```

Launch Cursor from a terminal so it inherits `HONCHO_API_KEY`:
```bash
cursor /path/to/your-project
```

Open a new chat. If everything is configured, Honcho will inject memory context at session start.
- Open a new chat and check if the AI knows your name/preferences
- Run `/honcho:status` to see the connection status
- Run `/honcho:setup` for guided configuration if something isn't working
Context about you, your preferences, and past work loads automatically at the start of every session. No re-explaining.
When Cursor's subagents complete work, their results are saved to Honcho. Delegated research, code analysis, and background tasks all contribute to persistent memory.
The afterAgentThought hook captures substantial AI reasoning (extended thinking blocks). When the AI does deep analysis, the insights are preserved for future sessions.
| Tool | Description |
|---|---|
| `search` | Semantic search across session messages |
| `chat` | Query Honcho's knowledge about the user |
| `create_conclusion` | Save insights about the user to memory |
| Command | Description |
|---|---|
| `/recall [topic]` | Search memory for something specific |
| `/remember [fact]` | Save something to persistent memory |
| `/honcho:interview` | Interview to capture preferences |
| `/honcho:status` | Show memory system status |
| `/honcho:setup` | Guided first-time configuration |
| Hook | Purpose |
|---|---|
| `sessionStart` | Load context from Honcho |
| `sessionEnd` | Upload messages and generate summary |
| `beforeSubmitPrompt` | Save messages, retrieve relevant context |
| `postToolUse` | Log tool activity to Honcho |
| `preCompact` | Anchor memory before context compaction |
| `stop` | Capture meaningful assistant responses |
| `subagentStop` | Save subagent results to memory |
| `afterAgentThought` | Capture deep reasoning |
| `afterAgentResponse` | Save assistant prose responses |
Honcho stores persistent config at ~/.honcho/config.json. This is created automatically on first run, or you can edit it manually:
```json
{
  "apiKey": "hch-your-key-here",
  "peerName": "eri",
  "hosts": {
    "cursor": {
      "workspace": "cursor",
      "aiPeer": "cursor"
    },
    "claude_code": {
      "workspace": "claude_code",
      "aiPeer": "claude"
    },
    "obsidian": {
      "workspace": "obsidian"
    }
  }
}
```

Each host gets its own workspace and aiPeer identity. The peerName (your identity) is shared across hosts. All three plugins -- cursor-honcho, claude-honcho, and obsidian-honcho -- read the same config file without conflicts. Obsidian uses a single-peer model (no aiPeer) since your notes are the messages, not a conversation with an AI.
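Given that shape, resolving a host's identity triple is straightforward: take the shared `peerName`, the host's `workspace`, and the host's `aiPeer` if it has one. A minimal sketch -- the `resolveHost` helper is illustrative, not part of the plugin:

```typescript
// Types mirror the ~/.honcho/config.json shape shown above.
interface HostConfig { workspace: string; aiPeer?: string }
interface HonchoConfig {
  apiKey: string;
  peerName: string;
  hosts: Record<string, HostConfig>;
}

// Illustrative helper: resolve the identity triple for one host.
export function resolveHost(cfg: HonchoConfig, host: string) {
  const entry = cfg.hosts[host];
  if (!entry) throw new Error(`no config for host "${host}"`);
  return {
    peerName: cfg.peerName,       // your identity, shared across hosts
    workspace: entry.workspace,   // per-host workspace
    aiPeer: entry.aiPeer ?? null, // null => single-peer model (e.g. obsidian)
  };
}
```

The `?? null` is what lets Obsidian's single-peer model coexist with the conversational hosts in one file.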
| Variable | Required | Default | Description |
|---|---|---|---|
| `HONCHO_API_KEY` | Yes | -- | Your Honcho API key from app.honcho.dev |
| `HONCHO_PEER_NAME` | No | `$USER` | Your identity in the memory system |
| `HONCHO_WORKSPACE` | No | auto-detected | Workspace name (auto-detected from host) |
| `HONCHO_AI_PEER` | No | auto-detected | How the AI is identified (auto-detected from host) |
| `HONCHO_ENDPOINT` | No | `production` | `production`, `local`, or a custom URL |
| `HONCHO_ENABLED` | No | `true` | Set to `false` to disable |
| `HONCHO_SAVE_MESSAGES` | No | `true` | Set to `false` to stop saving messages |
| `HONCHO_LOGGING` | No | `true` | Set to `false` to disable file logging |
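The boolean flags in the table share one pattern: only the literal string `false` disables them. A sketch of how such defaults could be applied -- the `resolveSettings` helper and the env-over-default precedence are assumptions for illustration, not documented plugin behavior:

```typescript
// Illustrative sketch: apply the table's defaults to an environment map.
// Precedence (env var wins over built-in default) is an assumption.
export function resolveSettings(env: Record<string, string | undefined>) {
  const flag = (name: string) => env[name] !== "false"; // default: true
  return {
    apiKey: env.HONCHO_API_KEY ?? null,            // required; no default
    peerName: env.HONCHO_PEER_NAME ?? env.USER ?? "user",
    endpoint: env.HONCHO_ENDPOINT ?? "production",
    enabled: flag("HONCHO_ENABLED"),
    saveMessages: flag("HONCHO_SAVE_MESSAGES"),
    logging: flag("HONCHO_LOGGING"),
  };
}
```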
- Check your API key: `echo $HONCHO_API_KEY`
- Make sure you opened Cursor from a terminal where the key is set
- Run `/honcho:setup` for guided diagnostics

- Check that `.cursor/mcp.json` exists in your project root with the correct absolute path
- Restart Cursor after changing MCP config

- The hooks.json paths must be absolute (not relative)
- Check Cursor's hook logs for JSON parse errors -- if you see ANSI escape codes in the output, update to the latest version, which suppresses TTY output in non-interactive mode
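The TTY fix above amounts to gating decorative output on whether stdout is interactive, since ANSI color codes corrupt the JSON the host tries to parse. A hypothetical sketch (`formatForStream` is illustrative, not the plugin's actual function):

```typescript
// Illustrative: emit colored text only when attached to a terminal;
// suppress it entirely in non-interactive (hook) mode, where stdout
// must carry nothing but JSON.
export function formatForStream(message: string, isTTY: boolean): string {
  return isTTY ? `\x1b[32m${message}\x1b[0m` : "";
}
```

In a real hook, `isTTY` would come from `process.stdout.isTTY`.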
```bash
export HONCHO_ENDPOINT="local"  # Uses localhost:8000
```

```bash
export HONCHO_ENABLED="false"
```

Skills for building AI applications with the Honcho SDK.
| Command | Description |
|---|---|
| `/honcho-dev:integrate` | Add Honcho to your project |
| `/honcho-dev:migrate-py` | Migrate Python code to latest Honcho SDK |
| `/honcho-dev:migrate-ts` | Migrate TypeScript code to latest Honcho SDK |
MIT -- see LICENSE
- Issues: GitHub Issues
- Discord: Join the Community
- X (Twitter): @honchodotdev
- Plastic Labs: plasticlabs.ai
- Honcho: honcho.dev
- Documentation: docs.honcho.dev
