This is the comparison page where the word "memory" is the most misleading thing in the room.
When Mem0 or Letta says "memory", they mean: the LLM in this loop should remember what happened earlier. A user told the agent they're vegetarian three turns ago — the next turn, the agent shouldn't suggest steak. Across conversations, the agent should remember "this user prefers concise responses". This is genuinely useful. It's a hard problem. Mem0 and Letta are good at it.
When puppyone says "context", we mean: the agent (any agent, not just this one) should have access to a durable, scoped, version-controlled workspace. Not "what did you say three turns ago", but "what does the spec say, what did yesterday's research agent write, what's the canonical version of the customer onboarding doc, what did Cursor commit at 14:32". This is a different problem at a different layer.
You can use Mem0 without puppyone. You can use puppyone without Mem0. Most production setups end up using both, because they solve different failures.
| Dimension | Mem0 / Letta | puppyone |
|---|---|---|
| What "memory" means | What the LLM should recall inside / across conversations | The whole shared workspace agents and humans operate on |
| Lifetime | Short to medium (turns, sessions, sometimes weeks) | Long (months, years; permanent until rolled back) |
| Atomic unit | Memory item / fact / summary / embedding | File, with full content and history |
| Primary user | One agent inside one application's chat loop | Many agents + humans across products and time |
| Storage shape | Vector store + structured memory schema | Filesystem (markdown, JSON, CSV, anything) |
| Native interface | SDK calls (memory.add, memory.search, memory.update) | Bash, MCP, REST, sandbox mount |
| Multi-agent collaboration | Per-agent memory; shared state requires explicit design | Native — same workspace, per-agent path scopes |
| Audit / version control | Limited to memory CRUD logs | Git-style commits per agent identity, full diff, rollback |
| SaaS ingestion | Not the job — you feed it conversation turns | Built-in connectors for Notion, Slack, Gmail, Postgres, etc. |
| Scope | Inside the chat loop | Across the entire agent stack |
| Best at | "Don't forget what the user told you" | "Be the durable substrate every agent reads from and writes to" |
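The "atomic unit" and "audit" rows are the crux of the table. A minimal sketch in plain Python (invented names, not puppyone's actual API) of what "file, with full content and history" means compared to a retrieved memory item:

```python
from dataclasses import dataclass, field

@dataclass
class VersionedFile:
    """Toy model of the file-as-atomic-unit idea: every commit
    (agent identity + content) is kept, so diff and rollback work."""
    path: str
    history: list = field(default_factory=list)  # [(agent, content), ...]

    def commit(self, agent: str, content: str) -> None:
        self.history.append((agent, content))

    def read(self) -> str:
        return self.history[-1][1]   # canonical current content

    def rollback(self) -> None:
        self.history.pop()           # drop the latest commit

# A memory item, by contrast, is a retrieved fact with no history:
memory_item = {"fact": "user prefers concise responses", "score": 0.87}

spec = VersionedFile("/specs/architecture.md")
spec.commit("cursor", "v1: monolith")
spec.commit("planner-agent", "v2: split into services")
assert spec.read() == "v2: split into services"
spec.rollback()
assert spec.read() == "v1: monolith"
```

The memory item answers "what is probably relevant"; the versioned file answers "what is canonically true, who wrote it, and what it said before".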
Use Mem0 or Letta when the problem you're solving is conversational continuity inside an agent: recalling what the user said earlier in the conversation, carrying preferences like "prefers concise responses" across sessions, and scoping that memory to each end-user.
This is a real, hard problem. puppyone is not built for it. We don't have an LLM-aware extraction pipeline, we don't auto-summarise conversation turns, we don't scope memory per end-user. Mem0 and Letta do these well; reach for them.
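Mem0's and Letta's real SDKs run LLM-backed extraction and vector search; the toy stand-in below (plain Python, invented names — not either library's actual API) only shows the shape of the problem they solve, recall scoped per end-user across turns:

```python
class ToyMemory:
    """Keyword stand-in for an extraction + vector-search pipeline."""
    def __init__(self):
        self._items = {}                      # user_id -> list of facts

    def add(self, fact: str, user_id: str) -> None:
        self._items.setdefault(user_id, []).append(fact)

    def search(self, query: str, user_id: str) -> list:
        words = set(query.lower().split())
        return [f for f in self._items.get(user_id, [])
                if words & set(f.lower().split())]

m = ToyMemory()
m.add("user is vegetarian", user_id="alice")
m.add("user prefers concise responses", user_id="alice")

# Three turns later, before suggesting dinner options:
assert m.search("vegetarian diet", user_id="alice") == ["user is vegetarian"]
assert m.search("vegetarian diet", user_id="bob") == []   # per-user scope
```

Everything here lives and dies inside one application's chat loop — which is exactly the scope the table above describes.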
Use puppyone when the problem is durable, shared, file-shaped context across multiple agents and humans: the spec every agent should follow, the report yesterday's research agent wrote, the canonical version of the onboarding doc, the commit Cursor made at 14:32.
If you've been pushing every project artefact into a vector-backed memory store and watching it bloat into an unsearchable mess of summaries — you're using a memory layer for a context job. Different shape needed.
In real production agent stacks, the two layers run side by side: Mem0 or Letta inside each agent's loop, puppyone underneath as the shared workspace.
The clean split: Mem0/Letta = "what the LLM should remember inside the loop"; puppyone = "what the world looks like outside the loop".
Isn't Letta's core_memory basically the same thing? Not really, and the differences matter:
- Letta gives an agent `core_memory` and an editable scratchpad — that's a working-memory primitive inside one agent's process, not a workspace shared across many agents and many tools (Cursor, n8n, your custom code) over time.
- Its retrieval is built around `memory.search` returning relevant items, not around `cat /specs/architecture.md` returning the canonical content of a spec.
- You can use Letta or Mem0 as one of the writers into puppyone — the agent uses Letta to manage its own working memory, then commits the artefacts that matter (final reports, transformed datasets, decisions) into puppyone as files for the rest of the stack.
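The "one of the writers" pattern above, as a hedged sketch: a local directory stands in for a puppyone workspace, and a plain dict stands in for the agent's Letta/Mem0-managed working memory (none of these names are real APIs):

```python
import pathlib
import tempfile

def finish_task(working_memory: dict, workspace: pathlib.Path) -> pathlib.Path:
    """Working memory stays inside the loop; only the artefact that
    matters is committed into the shared workspace as a file."""
    report = workspace / "research" / "summary.md"
    report.parent.mkdir(parents=True, exist_ok=True)
    report.write_text("# Findings\n" + "\n".join(
        f"- {fact}" for fact in working_memory["facts"]))
    return report

ws = pathlib.Path(tempfile.mkdtemp())
wm = {"facts": ["competitor ships weekly", "pricing is per-seat"],
      "scratch": "draft phrasing, dead ends ..."}   # never leaves the loop
path = finish_task(wm, ws)
assert path.read_text().startswith("# Findings")
assert "per-seat" in path.read_text()
```

The scratchpad contents are deliberately discarded: only the finished artefact becomes shared, durable context.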
Vector DBs (Pinecone, Zilliz, FAISS, pgvector) are retrieval primitives. Mem0/Letta build memory abstractions on top of them. puppyone is a different layer: the canonical store of files, with embeddings as a derived index living wherever you want.
We have a separate page on puppyone vs vector databases that goes into detail. The short version: vectors find a document; puppyone stores it.
There's no migration. Mem0 / Letta stay in your agent code.
Point your agents at the workspace: the research agent writes `/research`, the planner writes `/plans`, the executor writes `/output`. Mem0/Letta still handle each agent's working memory.

After a month, the boundary becomes very clear: ephemeral conversation state in Mem0/Letta; durable, shared, version-controlled files in puppyone.
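The per-agent path scopes can be sketched as a simple prefix check — hypothetical enforcement logic for illustration, not puppyone's actual implementation:

```python
# Each agent identity may only commit under its own prefix.
SCOPES = {"researcher": "/research", "planner": "/plans", "executor": "/output"}

def check_write(agent: str, path: str) -> bool:
    """Reject any write outside the agent's assigned subtree."""
    prefix = SCOPES.get(agent)
    return prefix is not None and path.startswith(prefix + "/")

assert check_write("researcher", "/research/notes.md")
assert not check_write("researcher", "/plans/q3.md")     # out of scope
assert not check_write("unknown-agent", "/output/x.md")  # no scope at all
```

Reads can stay workspace-wide while writes are scoped — that asymmetry is what lets many agents share one workspace without trampling each other.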
Does puppyone replace Mem0 or Letta? No. We don't do conversational fact extraction or in-loop memory management. Mem0 and Letta do that well. We sit underneath, holding the long-term shared context.
Can puppyone be used as the storage backend for Letta or Mem0? You can persist memory snapshots to puppyone for audit and version history, yes. We're not a drop-in replacement for their internal storage, though.
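Persisting a snapshot for audit could look like this — a sketch with a local directory standing in for puppyone and an invented snapshot layout:

```python
import datetime
import json
import pathlib
import tempfile

def snapshot_memory(memory_state: dict, workspace: pathlib.Path) -> pathlib.Path:
    """Dump the current Mem0/Letta state as a dated JSON file, so the
    workspace's version history doubles as a memory audit trail."""
    day = datetime.date.today().isoformat()
    out = workspace / "memory-snapshots" / f"{day}.json"
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text(json.dumps(memory_state, indent=2, sort_keys=True))
    return out

ws = pathlib.Path(tempfile.mkdtemp())
state = {"alice": ["vegetarian", "prefers concise responses"]}
path = snapshot_memory(state, ws)
assert json.loads(path.read_text()) == state
```

The memory layer remains the live store; the file is a point-in-time record you can diff, review, and roll back to.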
Why doesn't puppyone do automatic memory extraction? Because that's a per-agent, per-application concern best handled inside the agent loop. We're trying to be the substrate, not the brain. Adding extraction would either bias what we extract (fitting one agent's needs) or make us a worse general substrate.
My agent has Mem0. Why do I also need puppyone? Because Mem0 holds what the LLM should remember. It does not hold the spec your agent should follow, the codebase your agent should read, the report your agent wrote yesterday and your teammate needs today, or the Slack thread that should become a file. Those are puppyone's job.
Does puppyone have semantic search? puppyone is the file substrate; embeddings + semantic search are a layer on top, and we make it easy to wire a vector DB (or pgvector) over puppyone's content. We don't ship a built-in vector store because most teams already have one or want to choose their own.
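"Vectors find a document; puppyone stores it" as a sketch — a toy token-overlap index stands in for the vector DB, a temp directory for the workspace, and the file names are invented:

```python
import pathlib
import tempfile

def build_index(workspace: pathlib.Path) -> dict:
    """Derived index (path -> token set), rebuildable from files at any time."""
    return {p: set(p.read_text().lower().split())
            for p in workspace.rglob("*.md")}

def search(index: dict, query: str) -> pathlib.Path:
    """The retrieval layer only *finds* the best-matching file."""
    q = set(query.lower().split())
    return max(index, key=lambda p: len(q & index[p]))

ws = pathlib.Path(tempfile.mkdtemp())
(ws / "specs").mkdir()
(ws / "specs" / "architecture.md").write_text("service boundaries and queues")
(ws / "notes.md").write_text("lunch options near the office")

hit = search(build_index(ws), "queue service architecture")
# ...but the canonical content still comes from the file itself:
assert hit.name == "architecture.md"
assert "queues" in hit.read_text()
```

Because the index is derived, you can throw it away and rebuild it — or swap the toy scorer for real embeddings — without touching the canonical files.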
Mem0 / Letta = memory inside the agent loop. puppyone = the world the agent operates in. Don't put one in the slot meant for the other. Most production setups run both, with a clean line between "what we just said" and "what's true".
Give your agent both: working memory inside the loop, durable workspace outside it.

Get started