Conversation Memory
Conversation memory is memory scoped to a single conversation or session — the running context of the current dialogue. It is cleared when the session ends unless the application explicitly persists it elsewhere. It is distinct from long-term memory (which survives across sessions) and from working memory (which is the model's in-context window). In LLM agents this is typically implemented as the message array passed to each completion call, optionally pruned or summarized as the conversation grows. The session-level scoping in mem0 (`run_id`) and the recall storage in Letta both serve this layer.
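The message-array implementation described above can be sketched in a few lines. This is a minimal, illustrative example, not tied to any specific framework; the class and constant names are assumptions for the sketch:

```python
# Minimal sketch: conversation memory as a session-scoped message list.
# Names (ConversationMemory, MAX_TURNS) are illustrative, not from any library.

MAX_TURNS = 12  # prune to the most recent turns as the conversation grows

class ConversationMemory:
    def __init__(self, system_prompt: str):
        self.system_prompt = system_prompt
        self.turns: list[dict] = []  # cleared when the session ends

    def add(self, role: str, content: str) -> None:
        self.turns.append({"role": role, "content": content})

    def as_messages(self) -> list[dict]:
        # Build the array passed to each completion call:
        # the system prompt plus only the last MAX_TURNS messages.
        system = {"role": "system", "content": self.system_prompt}
        return [system] + self.turns[-MAX_TURNS:]

memory = ConversationMemory("You are a helpful support agent.")
memory.add("user", "My order hasn't arrived.")
memory.add("assistant", "Sorry to hear that. Can you share the order number?")
messages = memory.as_messages()
```

In a real agent, `memory.as_messages()` would be passed as the `messages` argument to the model API on every turn; nothing here persists once the `ConversationMemory` object is discarded.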
Example
A customer-support chat that loads "you are speaking with Alex who emailed earlier today" plus the prior 12 turns of conversation into the prompt for each new turn. When Alex closes the browser tab and returns tomorrow, the session ends and the conversation memory is gone — unless the app persisted a summary into long-term memory before the session closed.
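The persistence step at the end of the example can be sketched as follows. Everything here is a hedged stand-in: `summarize` represents an LLM summarization call, and `long_term` represents whatever durable store (vector DB, key-value store) the application uses for long-term memory:

```python
# Hedged sketch: persist a summary into long-term memory before the
# session closes, so conversation memory can be safely discarded.

def summarize(turns: list[dict]) -> str:
    # Placeholder: a real app would ask the model to summarize the session.
    return f"Session with {len(turns)} turns; last message: {turns[-1]['content']}"

def close_session(user_id: str, turns: list[dict], long_term: dict) -> None:
    long_term[user_id] = summarize(turns)  # survives across sessions
    turns.clear()  # conversation memory ends with the session

long_term: dict[str, str] = {}
turns = [
    {"role": "user", "content": "Where is my order?"},
    {"role": "assistant", "content": "Order 1234 ships tomorrow."},
]
close_session("alex", turns, long_term)
```

When Alex returns the next day, the app starts a fresh conversation memory but can seed the prompt with `long_term["alex"]`.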