Show HN: Llmswap – Solving "Multiple Second Brains" with Per-Project AI Memory


I kept seeing developers (including myself) struggle with the same problem: "I need multiple second brains for different aspects of my life, but AI keeps forgetting context between sessions."

So I built llmswap v5.1.0 with a workspace system that gives you persistent, per-project AI memory.

How it works:

  - cd ~/work/api-platform → AI loads enterprise patterns, team conventions

  - cd ~/learning/rust → AI loads your learning journey, where you struggled

  - cd ~/personal/side-project → AI loads personal preferences, experiments

Each workspace has independent memory (context.md, learnings.md, decisions.md) that persists across sessions. Your AI mentor actually remembers what you learned yesterday, last week, last month.
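
Conceptually, the switching is just "resolve the workspace from the current directory, then load its memory files into the prompt." Here's a minimal sketch of that idea (the ".llmswap" marker directory and function names are illustrative assumptions, not llmswap's actual internals):

    from pathlib import Path

    MEMORY_FILES = ("context.md", "learnings.md", "decisions.md")

    def find_workspace(start: Path) -> Path | None:
        """Walk up from the given directory until a workspace marker is found."""
        for directory in (start, *start.parents):
            marker = directory / ".llmswap"  # marker name is an assumption
            if marker.is_dir():
                return marker
        return None

    def load_memory(workspace: Path) -> str:
        """Concatenate the per-project memory files into one context block."""
        parts = []
        for name in MEMORY_FILES:
            path = workspace / name
            if path.exists():
                parts.append(f"## {name}\n{path.read_text()}")
        return "\n\n".join(parts)

    # Whatever directory you cd into decides which memory gets loaded.
    workspace = find_workspace(Path.cwd())
    if workspace is not None:
        system_context = load_memory(workspace)  # prepended to every prompt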

Key features:

  • Auto-learning journals (AI extracts key learnings from every conversation; rough sketch after this list)

  • 6 teaching personas (rotate among Guru, Socrates, Coach, and more for different perspectives)

  • Works with ANY provider (Claude Sonnet 4.5, IBM watsonx, GPT-4, o1, Gemini, Groq, Ollama)

  • Python SDK + CLI in one tool

  • Zero vendor lock-in
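
The auto-learning journals rest on a simple idea: at the end of a conversation, the model itself is asked to distill what you learned, and that summary is appended to learnings.md in the active workspace. A rough sketch of that loop (the prompt wording and the ask_llm callable are illustrative, not llmswap's internals):

    from datetime import date
    from pathlib import Path

    def append_learnings(workspace: Path, transcript: str, ask_llm) -> None:
        """Distill key learnings from a session and journal them by date."""
        prompt = (
            "Extract the key learnings from this conversation "
            "as short bullet points:\n\n" + transcript
        )
        learnings = ask_llm(prompt)  # any LLM call works here
        entry = f"\n## {date.today().isoformat()}\n{learnings}\n"
        with (workspace / "learnings.md").open("a") as journal:
            journal.write(entry)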

Think of it as "cURL for LLMs" - universal, simple, powerful.
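
On the SDK side, provider swapping looks roughly like this. The class, method, and parameter names below are from memory of the README and may differ, so treat this as a hedged sketch rather than the definitive API:

    from llmswap import LLMClient  # class name is an assumption

    # Auto-detects whichever provider credentials you have configured
    # (Anthropic, OpenAI, IBM watsonx, Gemini, Groq, or local Ollama).
    client = LLMClient()

    # Same call regardless of backend, so swapping providers doesn't touch app code.
    response = client.query("Summarize decisions.md for this workspace")
    print(response.content)  # attribute name is an assumption

    # Or pin a provider explicitly (parameter name is an assumption):
    claude = LLMClient(provider="anthropic")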

The workspace system is what makes this different. As far as I can tell, none of the tools I compared (Claude Code, Cursor, Continue.dev) offer per-project persistent memory with auto-learning tracking.

Built for developers who:

  - Manage multiple projects and lose context when switching between them

  - Are tired of re-explaining their tech stack every session

  - Want an AI that builds on previous learnings instead of starting from zero

  - Need different "modes" for work/learning/side projects

Open to feedback! Especially interested in:

  1. What other workspace features would be useful?

  2. How do you currently manage AI context across projects?

  3. Would you use auto-learning journals?


GitHub: https://github.com/sreenathmmenon/llmswap

PyPI: pip install llmswap==5.1.0

Docs: https://llmswap.org