Memory System
OfficeLLM includes a built-in memory system that persists conversation history from both managers and workers. The memory system is designed to be extensible, so you can add custom storage backends easily.
Overview
The memory system automatically stores:
- All manager conversations (task coordination)
- All worker conversations (tool executions)
- Message history for each conversation
- Metadata (timestamps, agent info, provider details)
Built-in Storage Options
In-Memory Storage
The simplest option: conversations are stored in memory during runtime, and data is lost when the application stops. Use it for:
- Development and testing
- Single-session applications
- When persistence isn’t required
- Quick prototyping
Options:
- `maxConversations` (optional): Maximum number of conversations to store. When the limit is reached, the oldest conversations are removed.
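A minimal configuration sketch. The `officellm` import path, the `OfficeLLM` class name, and the shape of the memory option are assumptions; check your installed version for the exact names. Only `maxConversations` comes from the option list above.

```typescript
// Sketch only: the import path, class name, and memory config shape are assumptions.
import { OfficeLLM } from 'officellm';

const office = new OfficeLLM({
  // ...provider and agent configuration...
  memory: {
    type: 'in-memory',        // assumed discriminator for the built-in store
    maxConversations: 100,    // oldest conversations are dropped past this limit
  },
});
```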
Redis Storage
Persistent storage using Redis. Conversations survive application restarts and can be shared across instances. To get started:
- Install the Redis client: `npm install redis`
- Run a Redis server (e.g., `docker run -p 6379:6379 redis`)

Use it for:
- Production applications
- Multi-instance deployments
- When conversation history needs to persist
- Sharing conversations across services
Configuration options:
- `host`: Redis server hostname
- `port`: Redis server port
- `password` (optional): Authentication password
- `db` (optional): Redis database number (default: 0)
- `keyPrefix` (optional): Prefix for all Redis keys (default: 'officellm:conv:')
- `ttl` (optional): Time-to-live for conversations in seconds
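A configuration sketch using the options listed above. As before, the import path and the overall config shape are assumptions; only the option names themselves are taken from this page.

```typescript
// Sketch only: import path and config shape are assumptions; the option
// names (host, port, password, db, keyPrefix, ttl) match the list above.
import { OfficeLLM } from 'officellm';

const office = new OfficeLLM({
  // ...provider and agent configuration...
  memory: {
    type: 'redis',                     // assumed discriminator for the Redis store
    host: 'localhost',
    port: 6379,
    password: process.env.REDIS_PASSWORD,
    db: 0,
    keyPrefix: 'officellm:conv:',
    ttl: 60 * 60 * 24 * 7,             // keep conversations for 7 days
  },
});
```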
Using Memory
Accessing Memory
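The memory instance is available from the office object. `office.getMemory()` is referenced in the troubleshooting section below; treat the rest of this snippet as illustrative.

```typescript
// `office` is the configured OfficeLLM instance from the examples above.
const memory = office.getMemory();

if (!memory) {
  console.warn('No memory backend configured');
}
```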
Querying Conversations
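A query sketch. The method name `queryConversations` and the filter and pagination fields are hypothetical, not confirmed OfficeLLM APIs; adjust to whatever your version exposes.

```typescript
// Hypothetical query API; method and field names are assumptions.
const memory = office.getMemory();

const recent = await memory.queryConversations({
  agentType: 'manager',             // e.g. only manager conversations
  since: Date.now() - 86_400_000,   // last 24 hours
  limit: 20,
  offset: 0,
});

for (const conversation of recent) {
  console.log(conversation.id, conversation.messages.length);
}
```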
Retrieving Specific Conversations
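Fetching a single conversation by ID. The method name `getConversation` is an assumption.

```typescript
// Hypothetical: getConversation(id) is an assumed method name.
const memory = office.getMemory();

const conversation = await memory.getConversation('conv-123');
if (conversation) {
  for (const message of conversation.messages) {
    console.log(`[${message.role}] ${message.content}`);
  }
}
```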
Getting Statistics
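A statistics sketch. The method name `getStats()` and the fields it returns are assumptions.

```typescript
// Hypothetical stats API; names are assumptions.
const memory = office.getMemory();

const stats = await memory.getStats();
console.log(`Stored conversations: ${stats.conversationCount}`);
console.log(`Total messages: ${stats.messageCount}`);
```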
Cleanup
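`office.close()` is referenced in the troubleshooting section below; the conversation-deletion call is a hypothetical method name.

```typescript
const memory = office.getMemory();

// Remove a single conversation (assumed API)
await memory.deleteConversation('conv-123');

// Close the office (and its memory backend) when shutting down
await office.close();
```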
Conversation Structure
Each stored conversation has the following structure:
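The exact shape depends on your OfficeLLM version; the sketch below is inferred from the data listed in the Overview (message history plus metadata such as timestamps, agent info, and provider details).

```typescript
// Illustrative shape only, inferred from the Overview section above.
interface StoredConversation {
  id: string;
  agentType: 'manager' | 'worker';    // which kind of agent produced it
  messages: Array<{
    role: 'system' | 'user' | 'assistant' | 'tool';
    content: string;
    timestamp?: number;
  }>;
  metadata: {
    createdAt: number;                // timestamps
    updatedAt: number;
    agent?: string;                   // agent info
    provider?: string;                // provider details
    model?: string;
  };
}
```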
Creating Custom Memory Providers
You can create custom storage backends by extending BaseMemory:
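`BaseMemory` is named above, but the abstract methods it actually requires may differ from this sketch; the method names below reuse the illustrative `StoredConversation` shape from the previous section and should be treated as assumptions.

```typescript
// Sketch only: BaseMemory's real abstract methods may differ.
import { BaseMemory } from 'officellm'; // assumed export path

class MapMemory extends BaseMemory {
  private store = new Map<string, StoredConversation>();

  async saveConversation(conversation: StoredConversation): Promise<void> {
    this.store.set(conversation.id, conversation);
  }

  async getConversation(id: string): Promise<StoredConversation | null> {
    return this.store.get(id) ?? null;
  }

  async deleteConversation(id: string): Promise<void> {
    this.store.delete(id);
  }

  async close(): Promise<void> {
    this.store.clear();
  }
}
```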
Best Practices
1. Always Close Connections
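Close the office on shutdown so the memory backend (e.g. Redis) can flush and release its connection. `office.close()` is referenced in the troubleshooting section; the signal handling around it is illustrative.

```typescript
process.on('SIGTERM', async () => {
  await office.close(); // releases the memory backend connection
  process.exit(0);
});
```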
2. Handle Memory Errors Gracefully
Memory operations are wrapped in try-catch blocks internally, so your application won't crash if memory storage fails. However, you should monitor logs for memory-related errors.
3. Use TTL for Redis
Set a TTL (time-to-live) for Redis conversations to prevent unbounded growth:
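The `ttl` option is listed under the Redis configuration above; the surrounding config shape is an assumption.

```typescript
import { OfficeLLM } from 'officellm'; // assumed import path

const office = new OfficeLLM({
  // ...other configuration...
  memory: {
    type: 'redis',
    host: 'localhost',
    port: 6379,
    ttl: 60 * 60 * 24 * 30, // expire conversations after 30 days
  },
});
```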
4. Implement Pagination
When querying large datasets, always use pagination:
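A pagination sketch using the hypothetical `queryConversations` method from the querying section; the `limit`/`offset` parameters are assumptions.

```typescript
// Hypothetical pagination parameters on an assumed query method.
const memory = office.getMemory();
const pageSize = 50;

for (let offset = 0; ; offset += pageSize) {
  const page = await memory.queryConversations({ limit: pageSize, offset });
  if (page.length === 0) break;
  // ...process this page...
}
```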
5. Monitor Memory Usage
For in-memory storage, be mindful of memory limits: cap stored history with maxConversations (see the in-memory configuration example above).
Example: Complete Usage
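A sketch that puts the pieces together. Only `office.getMemory()` and `office.close()` are referenced elsewhere on this page; the import path, class name, config shape, and query method are assumptions.

```typescript
// Sketch only: treat names not confirmed by this page as assumptions.
import { OfficeLLM } from 'officellm';

async function main() {
  const office = new OfficeLLM({
    // ...provider and agent configuration...
    memory: {
      type: 'redis',
      host: 'localhost',
      port: 6379,
      ttl: 60 * 60 * 24 * 7, // keep conversations for 7 days
    },
  });

  try {
    // ...run tasks through managers and workers as usual...

    // Inspect what was stored (assumed query API)
    const memory = office.getMemory();
    const conversations = await memory.queryConversations({ limit: 10, offset: 0 });
    console.log(`Stored ${conversations.length} recent conversations`);
  } finally {
    // Always release the memory backend connection
    await office.close();
  }
}

main().catch(console.error);
```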
Troubleshooting
Redis Connection Errors
- Ensure Redis is running: `docker run -p 6379:6379 redis`
- Install the redis package: `npm install redis`
- Check connection details (host, port, password)
- Verify firewall settings
Memory Not Storing Conversations
If conversations aren't being stored:
- Check that memory is configured in OfficeLLMConfig
- Verify the memory instance is accessible: `office.getMemory()`
- Check application logs for memory-related errors
- Ensure proper cleanup with `await office.close()`
Performance Issues
For large conversation histories:
- Use Redis instead of in-memory storage
- Implement pagination when querying
- Set appropriate TTL values
- Consider archiving old conversations
- Use specific queries instead of fetching all conversations