Embeddings and persistent memory
How to give your agent long-term memory with vector embeddings.
What persistent memory does
By default, your agent's context is limited to the current conversation. Persistent memory lets it remember information across sessions, recall past work, and build on previous interactions.
How it works
Le Bureau uses an embedded vector database to store and search your agent's memories. When enabled:
- The agent creates embeddings (numerical representations) of important information during conversations.
- These embeddings are stored on the desktop's persistent disk.
- When you start a new conversation, the agent searches its memory for relevant context.
- Retrieved memories are injected into the conversation, giving the agent continuity.
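The retrieval step above can be sketched in a few lines. This is a minimal illustration of embedding-based memory search, not Le Bureau's actual implementation: the toy `embed` function stands in for a real embeddings API call, and the in-memory `index` stands in for the embedded vector database.

```python
import math

def embed(text: str) -> list[float]:
    # Toy stand-in for a real embeddings model: bucket words into a fixed-size vector.
    vec = [0.0] * 8
    for word in text.lower().split():
        vec[sum(ord(c) for c in word) % 8] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: how closely two vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Stored memories, each with a precomputed embedding.
memories = [
    "We are building a REST API with Express",
    "The user prefers TypeScript",
]
index = [(m, embed(m)) for m in memories]

def recall(query: str, top_k: int = 1) -> list[str]:
    # Rank stored memories by similarity to the query and return the best matches.
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [m for m, _ in ranked[:top_k]]
```

A real embeddings model produces vectors with hundreds or thousands of dimensions that capture meaning, not just word overlap, but the search mechanics are the same: embed the query, compare against stored vectors, inject the top matches.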
What you need
Persistent memory requires an embeddings API key and model, separate from your main AI provider key. Embeddings use a specialized model optimized for converting text into vectors.
Supported embeddings providers
| Provider | Example model | Notes |
|---|---|---|
| OpenAI | text-embedding-3-small | Most common, good quality |
| OpenRouter | Various | Access to multiple embedding models |
Setting up persistent memory
During desktop creation
- Click New Desktop.
- Configure your main AI provider (Anthropic, OpenAI, or OpenRouter).
- Expand the Embeddings section.
- Select an embeddings provider (OpenAI or OpenRouter).
- Enter your embeddings API key.
- Choose an embeddings model.
- Create the desktop.
On an existing desktop
- Open your desktop's settings.
- Add or update the embeddings configuration.
- Restart the desktop.
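Either way, the embeddings configuration boils down to three values: a provider, an API key, and a model. As a sketch only, using hypothetical variable names (the actual setting names depend on your Le Bureau version; use the settings UI described above):

```shell
# Hypothetical names for illustration -- configure these via the desktop settings UI
export EMBEDDINGS_PROVIDER=openai
export EMBEDDINGS_API_KEY=sk-...
export EMBEDDINGS_MODEL=text-embedding-3-small
```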
What memory looks like in practice
With memory active, your agent can:
- Remember project context: "We are building a REST API with Express and PostgreSQL."
- Recall past decisions: "Last time, we decided to use JWT for authentication."
- Track progress: "The user asked me to set up three microservices. Two are done."
- Learn preferences: "The user prefers TypeScript over JavaScript."
Without memory, every new conversation starts from scratch.
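The injection step can be pictured as prepending retrieved memories to the system prompt before the conversation starts. A hypothetical sketch, assuming a generic chat-message format (the real layout depends on the chat API in use):

```python
def build_prompt(system: str, retrieved_memories: list[str], user_message: str) -> list[dict]:
    # Prepend retrieved memories to the system prompt so the model sees past context.
    memory_block = "\n".join(f"- {m}" for m in retrieved_memories)
    return [
        {"role": "system", "content": f"{system}\n\nRelevant memories:\n{memory_block}"},
        {"role": "user", "content": user_message},
    ]

messages = build_prompt(
    "You are a coding agent.",
    ["The user prefers TypeScript over JavaScript."],
    "Add a /health route to the API.",
)
```

The model never "remembers" anything itself; continuity comes entirely from what gets retrieved and injected here.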
Where memories are stored
Memories live in ~/.openclaw/workspace/memory/ on the desktop's disk. As long as the desktop is not destroyed, they persist across reboots, stops, and restarts.
You can inspect the memory directory from the terminal:
```shell
ls ~/.openclaw/workspace/memory/
```

Cost
Embeddings API calls are charged by your provider based on tokens processed. The cost is typically very low: a few cents per million tokens for text-embedding-3-small.
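As a rough back-of-the-envelope check, assuming a price of $0.02 per million tokens and roughly 4 characters per token for English text (both figures are approximations; check your provider's pricing page):

```python
def embedding_cost_usd(total_chars: int, usd_per_million_tokens: float = 0.02) -> float:
    # Rough estimate: ~4 characters per token for English text.
    tokens = total_chars / 4
    return tokens / 1_000_000 * usd_per_million_tokens

# Embedding 10,000 conversation snippets of ~500 characters each
# comes out to fractions of a dollar at these assumed rates.
cost = embedding_cost_usd(10_000 * 500)
```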
Tips
- You can use a different provider for embeddings than for your main model. For example, Anthropic for chat and OpenAI for embeddings.
- If you do not need memory, skip the embeddings configuration. Your agent still works; it just will not remember past sessions.
- Memory quality improves over time as the agent accumulates context about your projects and preferences.
Related docs
Configure AI provider
Set up your API key for Anthropic, OpenAI, or OpenRouter on Le Bureau.
Provider setup
Configure Anthropic, OpenAI, or OpenRouter as your AI provider with your own API key.
OpenClaw overview
What OpenClaw is, how it runs on Le Bureau desktops, and how it connects your chat to an AI model.