Documentation Index
Fetch the complete documentation index at: https://docs.flowx.ai/llms.txt
Use this file to discover all available pages before exploring further.
Overview
Inside a BPMN process, every node (subprocess, integration workflow, AI agent, business rule, or user task) reads from and writes to a shared store of process variables (also called process instance data in code). This page explains the mental model: where each node type's output lands, and how the next node, or a user task screen, reads it. For the configuration syntax of any specific mapping, follow the links to the dedicated pages.

If you are calling FlowX from an external app (a React frontend, another backend service) and want to fetch a process result over HTTP, that is a different question: see Consuming FlowX from external apps.
The mental model
A process variable is a key on the process instance data store. Every node configures two mappings:

- Input mapping: which existing process variables to send into the node, or how to compute them.
- Output mapping or result key: where the node's output is written back into the process variables.
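The store and the two mappings can be sketched in a few lines. This is illustrative Python, not FlowX code; `process_instance_data` and `run_node` are stand-in names for the concepts above:

```python
# Illustrative only: a process instance's shared variable store, with a
# node reading via an input mapping and writing via a result key.
process_instance_data = {}          # the shared store all nodes see

def run_node(input_keys, compute, result_key):
    """Input mapping: pick keys out of the store.
    Output mapping: write the node's result back under result_key."""
    node_input = {k: process_instance_data.get(k) for k in input_keys}
    process_instance_data[result_key] = compute(node_input)

process_instance_data["customer"] = {"id": "C-42"}   # e.g. set by a user task
run_node(["customer"],
         lambda d: {"score": 710, "for": d["customer"]["id"]},
         "creditCheck")
# The next node simply reads process_instance_data["creditCheck"].
```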
Choose the right pattern
| If you need to… | Use this node | Where its output lands | Reference |
|---|---|---|---|
| Run a stateful, multi-step BPMN flow with its own user tasks | Subprocess (call activity) | Mapped into parent variables via Data Mapper | Data mappers |
| Call an integration workflow (REST, DB, AI ops) and capture the result | Send Message Task with Start Integration Workflow action | Written to the configured Result key, then a Receive Message Task unblocks the process | Start Integration Workflow |
| Call an AI agent (extraction, decision, generation) | Service Task with the agent integration | Mapped back to process variables in the action’s output mapping | BPMN integration with agents |
| Compute or transform values inline | Business Rule action | Writes to keys passed to output.put(key, value) | Business rule action |
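The business-rule row can be made concrete with a short sketch of the `input`/`output` contract. The `KVStore` stub is illustrative, not a FlowX runtime class, and shadowing Python's built-in `input` is deliberate here to mirror the script contract:

```python
# Illustrative stub of the business-rule script contract: read mapped-in
# keys from `input`, write results with `output.put(key, value)`.
class KVStore:
    def __init__(self, data=None):
        self._data = dict(data or {})
    def get(self, key):
        return self._data.get(key)
    def put(self, key, value):
        self._data[key] = value

input = KVStore({"loan": {"amount": 250_000, "termYears": 30}})  # noqa: A001
output = KVStore()

# The script body: compute a value inline and store it under a key.
loan = input.get("loan")
output.put("monthlyPrincipal",
           round(loan["amount"] / (loan["termYears"] * 12), 2))
```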
The Receive Message Task pattern (workflows)
This is the part most people miss the first time. When a process triggers an integration workflow with Start Integration Workflow, the process must wait for the workflow to finish before reading its output. To wait, place a Receive Message Task after the Send Message Task that triggers the workflow:

1. Add a Send Message Task. Configure it with the Start Integration Workflow action and set the Result key: this is the process-variable path where the workflow's output will be written.
2. Add a Receive Message Task immediately after. The process pauses on this node until the workflow completes; when it does, the workflow's output lands at the result key.
For AI agents: the integration is typically wired on a Service Task, and the agent’s response is mapped back to process variables in the action’s output mapping. No separate Receive Message Task is required. See BPMN integration with agents.
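The wait pattern can be sketched with a future standing in for the running workflow. All names here are hypothetical; FlowX's own messaging does this for you:

```python
# Illustrative only: the Send Message Task fires the workflow and moves
# on; the result key is empty until the Receive Message Task unblocks.
from concurrent.futures import ThreadPoolExecutor

process_instance_data = {"loan": {"customerId": "C-42"}}

def credit_check_workflow(customer_id):
    # Stand-in for the integration workflow's actual work.
    return {"score": 710, "customerId": customer_id}

executor = ThreadPoolExecutor(max_workers=1)

# Send Message Task: trigger the workflow, do NOT wait for it.
pending = executor.submit(credit_check_workflow,
                          process_instance_data["loan"]["customerId"])
# Here, "creditCheck" is not in process_instance_data yet: a node placed
# between Send and Receive would read an empty value.

# Receive Message Task: block until completion, then the output lands
# at the configured result key.
process_instance_data["creditCheck"] = pending.result()
executor.shutdown(wait=True)
```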
Reading process variables downstream
Once a value is in process variables, the way you reference it depends on where you are reading from:

- In another node's input mapping (Send Message Task action, agent input, etc.): use the placeholder syntax documented on that node's page, for example `${processInstanceData.customer.id}` in a workflow input mapping (see Start Integration Workflow → Input mapping).
- In a subprocess's Data Mapper: use `${variableName}` against the subprocess's defined Input Parameters. The Data Mapper is the canonical interface for parent ↔ subprocess data flow (see Data mappers).
- In a business rule script: read from `input` (the keys you mapped in) and write to `output` via `output.put("keyName", value)`. See Business rule action.
- In a gateway condition: reference the same process variables to choose a path.
- In a user task screen: bind UI components to process data. The value flows automatically once the variable exists in process instance data.
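To make the dotted placeholder form concrete, here is a toy resolver for `${processInstanceData.customer.id}`-style references. It is illustrative only; FlowX performs this substitution itself:

```python
# Illustrative resolver for dotted placeholders such as
# ${processInstanceData.customer.id}; FlowX does this internally.
import re

def resolve(template, process_instance_data):
    def lookup(match):
        path = match.group(1).split(".")[1:]   # drop the root segment
        value = process_instance_data
        for key in path:                       # walk the nested dicts
            value = value[key]
        return str(value)
    return re.sub(r"\$\{(processInstanceData(?:\.\w+)+)\}", lookup, template)

data = {"customer": {"id": "C-42"}}
resolve("customer=${processInstanceData.customer.id}", data)  # 'customer=C-42'
```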
End-to-end example
A loan-onboarding process uses an integration workflow to fetch a credit score, then an AI agent to extract data from an uploaded document, then a user task to confirm the final summary.

1. User uploads a document on a user task. The file id lands in process variables, for example at `loan.documentId`.
2. A Send Message Task triggers a credit-check integration workflow. Its input mapping reads `loan.customerId`; its Result key is set to `creditCheck`.
3. A Receive Message Task waits for the workflow. When the workflow completes, its response lands at `creditCheck` in process variables.
4. A Service Task calls the document-extraction agent. Its input mapping sends `loan.documentId`; its output mapping writes the extracted fields to `loan.extracted`.

Common pitfalls
Reading a workflow result before the Receive Message Task
The result key only contains data after the Receive Message Task unblocks. A node placed between Send and Receive will see an empty value.
Result key collision
Two integrations writing to the same result key will overwrite each other. Use distinct, descriptive paths (`creditCheck`, `addressVerification`) instead of generic ones (`result`, `data`).

Wrong placeholder prefix
`${variableName}` works in subprocess Data Mappers; `${processInstanceData.path}` works in integration node config. Mixing them produces unresolved placeholders at runtime. Follow the syntax shown on each node's page.

Expecting a synchronous result back to an external HTTP caller
Starting a process via REST returns metadata (process instance UUID, state), not the process’s business output. To surface results to an external app, use the SDK, a Service Task callback, or a Kafka event. See Consuming FlowX from external apps.
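A stubbed sketch of that pitfall. Everything here is hypothetical, including the field names; it is a stand-in, not the FlowX REST API:

```python
# Illustrative stub: starting a process returns instance metadata,
# not the process's business output.
import uuid

def start_process_stub(process_name, params):
    # Stand-in for the REST call that starts a process instance.
    return {"processInstanceUuid": str(uuid.uuid4()), "state": "STARTED"}

response = start_process_stub("loan_onboarding", {"customerId": "C-42"})

# Metadata only: no "creditCheck", no "loan.extracted" in this payload.
assert "creditCheck" not in response
```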
Related resources
Data mappers
Parent ↔ subprocess parameter mapping
Start Integration Workflow
Workflow input/output and the Receive Message Task pattern
BPMN integration with agents
Calling AI agents from a process
Business rule action
Inline scripts that read from `input` and write to `output`

Send/Receive Message Task
Node-level configuration of the wait pattern
Consuming FlowX from external apps
External HTTP/SDK integration patterns

