Available starting with FlowX.AI 5.6.0

Conversational workflows require the Chat component for user interaction.

Overview

Conversational workflows are a specialized workflow type designed for multi-turn chat interactions. Unlike Output Focused workflows, which process structured input and output, conversational workflows manage ongoing dialogue between users and AI agents: they handle message exchange, session memory, and response routing. When creating a workflow in the Integration Designer, you choose the workflow type: Chat Driven or Output Focused. This choice is permanent and cannot be changed after creation.

Session memory

Automatically persist and retrieve conversation history across messages within a session

Dedicated Start node

The Start node provides Chat Session ID and User Message fields for receiving chat input

Chat replies

AI agent nodes send responses directly to the Chat component in real time

Intent routing

Classify user messages and route to appropriate workflow branches using the Intent Classification node

Chat driven vs output focused workflows

| Aspect | Chat Driven | Output Focused |
| --- | --- | --- |
| Purpose | Multi-turn dialogue with users | Structured input/output processing |
| Start node | Chat Session ID + User Message fields | Standard Start node (JSON input) |
| Memory | Built-in session memory | No memory |
| Response delivery | Direct chat reply from Custom Agent nodes | Output on End node |
| Data model | Input/Output tabs hidden | Full data model access |
| Integration | Chat component only | Process actions, subworkflows, API |
The workflow type cannot be changed after creation. Choose the appropriate type when creating the workflow.

How it works

1. User sends a message: The Chat component sends the user's message and session ID to the workflow.
2. Memory retrieval: The Start node retrieves session memory (the latest 3 message turns plus a summary of earlier conversation history).
3. AI processing: The workflow processes the message through AI nodes. Nodes with Use Memory enabled receive the conversation history as context for their LLM calls.
4. Response delivery: The Custom Agent node with Send as Chat Reply enabled sends its response directly to the Chat component in Markdown format.
5. Memory update: The system stores the user message and AI response in session memory, updating the conversation summary if needed.
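The five steps above amount to a single request/response cycle per message. The following Python sketch mimics that cycle end to end; every name in it (`SessionMemory`, `call_llm`, `handle_turn`) is hypothetical, since FlowX manages this flow internally.

```python
# Illustrative sketch of one conversational-workflow turn.
# All names here are hypothetical; FlowX handles this internally.

def call_llm(prompt: str) -> str:
    """Stand-in for the Custom Agent node's LLM call."""
    return f"Echo: {prompt.splitlines()[-1]}"

class SessionMemory:
    """Stores the latest turns in full, plus a summary of older history."""
    def __init__(self, max_turns: int = 3):
        self.max_turns = max_turns
        self.turns = []        # list of (user_message, agent_response) pairs
        self.summary = ""

    def context(self) -> str:
        lines = [f"user: {u}\nagent: {a}" for u, a in self.turns]
        if self.summary:
            lines.append(f"Summary of previous replies: {self.summary}")
        return "\n".join(lines)

    def update(self, user_msg: str, agent_msg: str) -> None:
        self.turns.append((user_msg, agent_msg))
        while len(self.turns) > self.max_turns:
            old_user, old_agent = self.turns.pop(0)
            # Real summarization would itself use an LLM; concatenation
            # is only a placeholder.
            self.summary = (self.summary + f" {old_user} -> {old_agent}").strip()

sessions = {}

def handle_turn(session_id: str, user_message: str) -> str:
    memory = sessions.setdefault(session_id, SessionMemory())  # step 2
    prompt = f"{memory.context()}\nuser: {user_message}"       # step 3
    reply = call_llm(prompt)                                   # step 3
    # Step 4: the reply would be pushed to the Chat component here.
    memory.update(user_message, reply)                         # step 5
    return reply
```

After more than 3 turns, the oldest pairs fall out of the full-text window and only survive in the summary, which is the behavior described in the Session memory section.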

Start node

In Chat Driven workflows, the Start node provides two dedicated input fields instead of the standard JSON editor:
| Field | Description | Notes |
| --- | --- | --- |
| Chat Session ID | Unique session identifier managed by the Chat component | Required. Used for memory retrieval |
| User Message | The user's message text | Required. Can be referenced in other nodes using ${userMessage} |
Both fields are populated automatically by the Chat component at runtime. When testing the workflow manually (via Run Workflow), you enter values in these fields directly.
The Start node in Chat Driven workflows is not a separate node type — it is the same Start node with a different layout tailored for chat input.
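Conceptually, the Chat component hands the Start node a small payload carrying both fields. The docs name the fields but not their exact wire format, so the JSON key names below are assumptions for illustration only.

```python
import json

# Hypothetical wire shape: the two Start node fields as a JSON payload.
# The key spellings ("chatSessionId", "userMessage") are assumptions.
chat_input = {
    "chatSessionId": "a1b2c3d4",       # managed by the Chat component
    "userMessage": "Where is my order?",
}
payload = json.dumps(chat_input)
```

When testing via Run Workflow, you supply equivalent values by hand in the two input fields instead.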

Custom Agent node

In Chat Driven workflows, the Custom Agent node has additional configuration options. The full layout from top to bottom:

Operation Prompt

The system prompt for the LLM. Use ${userMessage} to reference the user’s message.
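The `${userMessage}` placeholder follows shell-style syntax, so its behavior can be mimicked with Python's `string.Template` (the substitution itself happens inside FlowX; this sketch only illustrates the idea):

```python
from string import Template

# Python's string.Template uses the same ${name} placeholder syntax,
# which makes it convenient for mimicking the substitution.
operation_prompt = Template(
    "You are a support agent. Answer the user's question:\n${userMessage}"
)

rendered = operation_prompt.substitute(userMessage="How do I reset my password?")
```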

Use Memory

When enabled:
  • The node includes the conversation history (retrieved via session ID) in the LLM prompt
  • Memory consists of the latest 3 message turns plus a summary of earlier messages
  • The session ID is sent to the AI platform, which attaches the conversation context to the prompt

Settings

  • MCP Servers — Select MCP tools available to the agent
  • Knowledge Base — Connect a knowledge base for RAG-powered responses

Response

Send as Chat Reply

When enabled:
  • The node’s output is sent directly to the Chat component as a Markdown-formatted response
  • The Response Schema field is hidden (the LLM is instructed to return plain text)
  • A Chat Response tag appears on the node header
  • The response triggers a memory update (stores the user message and AI reply, and updates the conversation summary if needed)
At least one Custom Agent node in the workflow must have Send as Chat Reply enabled. If no node sends a chat reply, the console log displays an error.

Response Key

Always visible. Defines the key where the node output is stored in the workflow data.

Response Schema

Only visible when Send as Chat Reply is OFF. Defines the expected JSON structure of the LLM response.

Session memory

Chat Driven workflows use built-in session memory stored and managed by FlowX. Memory is structured as:
  • Recent turns — The latest 3 user/agent message pairs in full
  • Summary — A compressed summary of all earlier conversation history
The memory structure sent to the LLM:
user: [message content]
agent: [response content]
user: [message content]
agent: [response content]
user: [message content]
agent: [response content]
Summary of previous replies: [generated summary]
The summarization runs automatically when the conversation exceeds 3 turns. Only user messages and agent responses are included — internal workflow data is not part of the memory.
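The memory block shown above could be rendered for the LLM roughly as follows. This is a sketch of the documented format; FlowX's actual serialization may differ, and `render_memory` is a hypothetical helper.

```python
def render_memory(turns, summary=""):
    """Render session memory in the documented format.

    turns: list of (user_message, agent_response) pairs, oldest first.
    Only the latest 3 turns are rendered in full; earlier history is
    represented by the summary line.
    """
    lines = []
    for user_msg, agent_msg in turns[-3:]:
        lines.append(f"user: {user_msg}")
        lines.append(f"agent: {agent_msg}")
    if summary:
        lines.append(f"Summary of previous replies: {summary}")
    return "\n".join(lines)
```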

Console log

The workflow console log includes additional information for Chat Driven workflows:
  • Input tab — Displays User Message and Chat Session ID in JSON format (read-only)
  • Output tab — Displays the chat response as plain text (rather than in the JSON editor) for readability
  • Memory tab — Shows the conversation history and summary sent to the LLM for that workflow instance

Constraints

Chat Driven workflows cannot be referenced as subworkflows. The subworkflow node filters out Chat Driven workflows from the selection list.
The Start Integration Workflow action in processes filters out Chat Driven workflows. They can only be started through the Chat component.
Running a Chat Driven workflow without a user message triggers an error: “The user message is mandatory in conversational workflows.”
A Chat Driven workflow requires an End Flow node to complete the execution path. The End Flow node is simplified (header only, no body configuration) since responses are sent from Custom Agent nodes. The End Flow node is not auto-created — you must add it manually from the node palette.
Only Chat Driven workflows can be integrated into the Chat component. Output Focused workflows are filtered out from the Chat component workflow selection.
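The missing-user-message constraint above amounts to a simple input check. The sketch below is hypothetical; only the first error message is quoted from the documented behavior, and the session-ID check is an assumption based on the field being marked as required.

```python
def validate_chat_input(user_message, chat_session_id):
    """Reject chat input that violates the documented constraints.

    The first error text is quoted from the docs; the second is an
    assumed message for the required Chat Session ID field.
    """
    if not user_message:
        raise ValueError("The user message is mandatory in conversational workflows.")
    if not chat_session_id:
        raise ValueError("Chat Session ID is required for memory retrieval.")
```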

Testing conversational workflows

When testing via Run Workflow, the test modal provides two input fields:
  • User Message — The test message to send
  • Chat Session ID — A test session identifier (used for memory)
These replace the standard JSON editor used in Output Focused workflows.

Setting up a conversational workflow

1. Create a new workflow: In the Integration Designer, click + to create a new workflow. Enter a name and select Chat Driven as the workflow type.
2. Review the Start node: The Start node is created automatically with Chat Session ID and User Message fields pre-configured.
3. Add AI processing nodes: Add Custom Agent nodes or Intent Classification nodes to process user messages.
4. Enable Chat Reply: On the Custom Agent node that generates the final response, toggle Send as Chat Reply to ON.
5. Enable Memory (optional): On nodes that need conversation context, toggle Use Memory to ON.
6. Add an End Flow node: Add an End Flow node from the node palette and connect it to the final node in your workflow. The End Flow node has no body configuration in Chat Driven workflows.
7. Integrate with Chat component: In your UI Flow, add a Chat component and select the Chat Driven workflow.

Last modified on March 16, 2026