# Memory & Knowledge
Agents need to "remember" previous turns and access "external data" to provide high-quality answers.
## 1. Conversation Memory
Use `persistConversation` to give your agent a history. It handles storing, trimming, and summarizing previous turns automatically.
```ts
export const supportAgent = new AgentBuilder({ ... })
  .persistConversation('user') // Use preset: 'user' or 'agent'
```

### Presets
| Preset | Strategy | Frame Budget | Best For |
|---|---|---|---|
| `user` | Full | 40 | Interactive chats where recent turns matter. |
| `agent` | Summary | 20 | Long-running tasks where tokens are expensive. |
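The trimming behavior the table implies can be sketched as a pure function over the message list. This is an illustrative helper, not PURISTA's actual implementation, and it assumes one frame corresponds to one message:

```ts
type ChatMessage = { role: 'user' | 'assistant' | 'system'; content: string }

// Hypothetical sketch: keep only the most recent messages that fit the frame
// budget, always preserving a leading system prompt if one is present.
function trimToFrameBudget(messages: ChatMessage[], frameBudget: number): ChatMessage[] {
  const [first, ...rest] = messages
  const keepSystem = first?.role === 'system'
  const body = keepSystem ? rest : messages
  const budget = keepSystem ? frameBudget - 1 : frameBudget
  const trimmed = body.slice(Math.max(0, body.length - budget))
  return keepSystem ? [first!, ...trimmed] : trimmed
}
```

A summary strategy (the `agent` preset) would instead replace the dropped prefix with a condensed summary message rather than discarding it.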
### Accessing History
In your handler, `context.conversation` provides the API to manage history:
```ts
setHandler(async (context, payload) => {
  await context.conversation.addUser(payload.prompt)
  const messages = await context.conversation.getMessages()
  // ...
  await context.conversation.addAssistant(answer)
})
```

## 2. External Knowledge (RAG)
Knowledge adapters give your agent access to external documents such as FAQs, wikis, or vector databases.
### Defining Knowledge
```ts
export const supportAgent = new AgentBuilder({ ... })
  .useKnowledgeAdapter('supportFaq')
```

### Querying Knowledge
```ts
setHandler(async (context, payload) => {
  const docs = await context.knowledge.supportFaq.query(payload.prompt, 3)
})
```

PURISTA automatically scopes queries by `tenantId`, `principalId`, `agentName`, `agentVersion`, and `sessionId` to ensure data separation.
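One common way such scoping maps onto a vector store is a deterministic compound namespace derived from the scope fields. The interface and function below are an illustrative sketch, not PURISTA's actual types:

```ts
interface KnowledgeScope {
  tenantId: string
  principalId: string
  agentName: string
  agentVersion: string
  sessionId: string
}

// Hypothetical sketch: derive a deterministic namespace from the scope
// fields, so two tenants can never read each other's documents.
function scopedNamespace(adapterName: string, scope: KnowledgeScope): string {
  return [
    adapterName,
    scope.tenantId,
    scope.principalId,
    scope.agentName,
    scope.agentVersion,
    scope.sessionId,
  ].join('::')
}
```

Because the namespace is derived rather than caller-supplied, a handler cannot accidentally query outside its own tenant.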
## 3. Configuration at Runtime
Both memory and knowledge default to in-memory implementations. For production, inject persistent stores during bootstrap.
```ts
const instance = await supportAgent.getInstance(eventBridge, {
  conversationStore: new RedisConversationStore(),
  knowledgeAdapters: {
    supportFaq: new PineconeAdapter()
  }
})
```

Conversation stores follow the same pattern: the runtime keeps the logical `conversationId` stable and passes tenant, user, and agent metadata separately to the store implementation, so custom backends can build their own compound keys without guessing how PURISTA scoped the id.
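That contract can be sketched as a small interface plus an in-memory reference implementation. The type and method names here are illustrative and may differ from PURISTA's real store interface:

```ts
type StoredMessage = { role: 'user' | 'assistant'; content: string }

interface StoreMeta {
  tenantId: string
  principalId: string
  agentName: string
}

// Illustrative sketch of a store that builds its own compound key from the
// stable conversationId plus the metadata the runtime passes alongside it.
class InMemoryConversationStore {
  private data = new Map<string, StoredMessage[]>()

  private key(conversationId: string, meta: StoreMeta): string {
    return `${meta.tenantId}:${meta.principalId}:${meta.agentName}:${conversationId}`
  }

  async append(conversationId: string, meta: StoreMeta, msg: StoredMessage): Promise<void> {
    const k = this.key(conversationId, meta)
    const list = this.data.get(k) ?? []
    list.push(msg)
    this.data.set(k, list)
  }

  async load(conversationId: string, meta: StoreMeta): Promise<StoredMessage[]> {
    return this.data.get(this.key(conversationId, meta)) ?? []
  }
}
```

A Redis- or database-backed store would follow the same shape, swapping the `Map` for its own persistence layer.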
## Need something custom?
If you need to build your own store or adapter, see the Custom AI Stores & Adapters section in the advanced handbook.
