Available starting with FlowX.AI 5.6.0. Memory capabilities are available in Chat Driven workflows only.

Overview

Session memory allows AI nodes in conversational workflows to access previous messages from the current conversation. When enabled, FlowX automatically retrieves conversation history and injects it into the LLM’s context — giving the AI agent awareness of what was discussed earlier in the session. Memory is per-session (identified by Chat Session ID) and managed entirely by FlowX. You don’t need to build any memory retrieval logic — just toggle it on per node.

  • Automatic retrieval: The latest 3 message pairs plus a summary of earlier exchanges are retrieved on each message
  • Smart summarization: Older conversation history is automatically summarized to fit within context limits
  • Per-node control: Turn on memory per Custom Agent or Intent Classification node with the Use Memory toggle
  • Memory tab in console: View retrieved memory (raw turns + summary) in the workflow console log

How memory works

  1. User sends a message: The Chat component sends the message + chatSessionId to the Chat Driven workflow.
  2. Memory retrieval: The Start node retrieves conversation history for the session: the latest 3 user/agent message pairs in full, plus a summary of all earlier exchanges.
  3. Context injection: For each node with Use Memory enabled, the retrieved history is appended to the LLM system prompt, giving the AI agent context from prior turns.
  4. Response and storage: After the AI generates a response, the system stores both the user message and the agent response, updating the session memory for the next turn.
  5. Summary update: When the conversation exceeds 3 turns, older messages are automatically summarized. The summary is regenerated on each new message beyond the threshold.
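The per-turn lifecycle above can be sketched in pseudocode. This is an illustrative model only, not FlowX's implementation: the class, method names, and the `summarize` placeholder are assumptions; only the 3-turn threshold and the retrieve/store/summarize sequence come from the docs.

```python
RECENT_TURNS = 3  # latest user/agent pairs kept in full (per the docs)

class SessionMemory:
    """Hypothetical model of one chat session's memory."""

    def __init__(self):
        self.turns = []    # full list of (user_msg, agent_msg) pairs
        self.summary = ""  # compressed history beyond RECENT_TURNS

    def retrieve(self):
        # Step 2: latest pairs in full, plus summary of earlier exchanges
        return self.turns[-RECENT_TURNS:], self.summary

    def store(self, user_msg, agent_msg):
        # Step 4: persist the exchange; Step 5: refresh the summary
        self.turns.append((user_msg, agent_msg))
        older = self.turns[:-RECENT_TURNS]
        if older:  # conversation exceeded the 3-turn threshold
            self.summary = summarize(older)

def summarize(older_turns):
    # Placeholder for what is an LLM call in FlowX
    return f"Summary of {len(older_turns)} earlier exchange(s)."
```

On each new message, FlowX performs the equivalent of `retrieve()` before the LLM call and `store()` after it.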

Memory structure

The memory injected into the LLM prompt follows this structure:
user: [message content]
agent: [response content]
user: [message content]
agent: [response content]
user: [message content]
agent: [response content]
Summary of previous replies: [generated summary]
  • Recent turns — The latest 3 user/agent message pairs are included in full
  • Summary — A compressed summary of all conversation history before the last 3 turns
Only user messages and agent responses are included in memory. Internal workflow data, routing decisions, and intermediate node outputs are not part of the memory.
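Assembling that injected block is straightforward string formatting. The helper below is illustrative (FlowX builds this text internally); it only mirrors the `user:`/`agent:` lines and the `Summary of previous replies:` suffix shown above.

```python
def build_memory_block(recent_pairs, summary):
    """Format recent turns + summary in the structure shown above."""
    lines = []
    for user_msg, agent_msg in recent_pairs:
        lines.append(f"user: {user_msg}")
        lines.append(f"agent: {agent_msg}")
    if summary:
        lines.append(f"Summary of previous replies: {summary}")
    return "\n".join(lines)
```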

Enabling memory

Custom Agent node with Use Memory toggle enabled
Memory is controlled per node using the Use Memory toggle. It’s available on:
  • Custom Agent nodes — for context-aware AI responses
  • Intent Classification nodes — for more accurate classification using conversation history
Default: OFF. When enabled, the node sends the chatSessionId to the AI platform, which retrieves and attaches the conversation history to the LLM call.
Memory is only available in Chat Driven workflows. The toggle does not appear on nodes in Output Focused workflows.

Summarization

When a conversation exceeds 3 message turns, FlowX automatically summarizes the older messages using an LLM call. The summarization process:
  1. Collects all messages older than the most recent 3 turns
  2. Sends them to the LLM with a context extraction prompt
  3. Stores the resulting summary for future retrieval
  4. Regenerates the summary on each new message to incorporate the latest context
The summarization prompt instructs the LLM to extract the most important and relevant context from the conversation history, focusing on information needed to maintain coherent dialogue.
Summarization happens automatically — there is no configuration needed. The summary prefix ## Previous conversation summary: is prepended to the summary text in the prompt.
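A hedged sketch of that selection-and-summarize step: the function and prompt wording here are assumptions (FlowX's exact context extraction prompt is not published), but the 3-turn threshold and the `## Previous conversation summary:` prefix match the docs.

```python
RECENT_TURNS = 3
SUMMARY_PREFIX = "## Previous conversation summary:"

def update_summary(all_turns, call_llm):
    """Summarize everything older than the most recent 3 turns."""
    older = all_turns[:-RECENT_TURNS]
    if not older:
        return None  # conversation has not exceeded the threshold yet
    transcript = "\n".join(f"user: {u}\nagent: {a}" for u, a in older)
    # Paraphrase of a context extraction prompt, not FlowX's wording:
    prompt = ("Extract the most important and relevant context from this "
              "conversation history:\n" + transcript)
    return f"{SUMMARY_PREFIX} {call_llm(prompt)}"
```

Because the summary is regenerated on every message past the threshold, `update_summary` would run once per turn from the fourth message onward.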

Debugging memory in the console

Memory tab in the workflow console showing conversation summary and message history
The workflow console log includes a Memory tab that shows exactly what memory was sent to the LLM for each workflow run.
Level               Tab          Content
Workflow            Memory tab   Full conversation history and summary sent to the LLM
Workflow            Input tab    User Message and Chat Session ID in JSON format
Workflow            Output tab   Chat response as text
Custom Agent node   Output tab   Node response as text

Memory use summary

When an AI node with Use Memory enabled completes, the console displays the memoryUseSummary — a snapshot of the conversation context that was available to the LLM. This includes:
  • summary — the auto-generated summary of earlier conversation history
  • chatMessages — the most recent message pairs (up to 3, excluding the current message) with their messageId, userMessage, and agentMessage
{
  "summary": "User asked about account setup and verification steps.",
  "chatMessages": [
    {
      "messageId": "msg-003",
      "userMessage": "What documents do I need?",
      "agentMessage": "You need a valid ID and proof of address."
    },
    {
      "messageId": "msg-002",
      "userMessage": "How do I verify my account?",
      "agentMessage": "Navigate to Settings and select Verification."
    }
  ]
}
The Memory tab only appears when memoryUseSummary data exists for the workflow instance. If you don’t see it, verify that Use Memory is enabled on at least one node.
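A memoryUseSummary payload copied from the console can be inspected with any JSON tooling; the field names below match the example above.

```python
import json

# Payload taken from the example above
raw = """{
  "summary": "User asked about account setup and verification steps.",
  "chatMessages": [
    {"messageId": "msg-003",
     "userMessage": "What documents do I need?",
     "agentMessage": "You need a valid ID and proof of address."}
  ]
}"""

payload = json.loads(raw)
for msg in payload["chatMessages"]:
    print(f"{msg['messageId']}: {msg['userMessage']!r} -> {msg['agentMessage']!r}")
```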

Storage

Data                   Storage                            Details
Session ID             Browser storage + FlowX Database   Links the client chat instance to the server session
Message history        FlowX Database                     Complete record of user and agent messages per session
Conversation summary   FlowX Database                     Auto-generated summary of older conversation turns
Session metadata       FlowX Database                     Timestamps, workflow reference, user info
Session memory is tied to the chatSessionId — the same session ID retrieves the same memory across workflow runs. The Chat component manages session IDs automatically.
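That keying behavior can be modeled in a few lines. The in-memory dict below is purely illustrative (FlowX persists this in its database), showing only that the same chatSessionId resolves to the same history across runs.

```python
# Hypothetical server-side store: chatSessionId -> list of turns
_store = {}

def memory_for(chat_session_id):
    """Return (and lazily create) the history for a session."""
    return _store.setdefault(chat_session_id, [])

memory_for("sess-42").append(("Hi", "Hello! How can I help?"))
# A later workflow run with the same session ID sees the prior turn;
# a different session ID starts empty.
```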

Limitations

  • Memory is session-scoped — there is no cross-session or cross-user memory
  • The latest 3 turns are always included in full; older turns are summarized
  • Summarization uses an LLM call, which adds latency on the first message after the 3-turn threshold
  • Memory cannot be manually edited or cleared from the Designer UI
  • Only user messages and agent responses are stored — internal workflow data is excluded

  • Conversational workflows: Full guide to building Chat Driven workflows with memory and intent routing
  • Chat component: Runtime behavior, session management, and display modes
  • Intent Classification: Route conversations based on detected user intent with optional memory
  • Custom Agent node: Configure AI nodes with memory, chat reply, and response settings
Last modified on March 25, 2026