Available starting with FlowX.AI 5.3.0: Workflow Data Models let you define structured data models at the workflow level in Integration Designer, similar to Process Definitions, enabling integration between processes and workflows.

Overview

Define data models at the workflow level with input and output parameters, similar to Process Definitions

Workflow Data Models bring the same structured data management capabilities available in Process Definitions to Integration Designer workflows. By defining data models at the workflow level, you ensure consistent data lineage across your integration architecture, making workflows more reusable, testable, and maintainable.

Structured Data Management

Define entities, attributes, and data types at the workflow level, similar to Process Data Models

Input/Output Parameters

Specify which data enters and exits your workflow with automatic parameter management

Process Integration

Map data bidirectionally between processes and workflows using data mappers

Type Safety & Validation

Define data types and validation rules to ensure runtime data integrity

Key concepts

Workflow data model

A Workflow Data Model defines the structure of data used throughout a workflow’s execution. It consists of the following (a sketch follows the list):
  • Entities: Logical groupings of related data (for example, Customer, Order, Payment)
  • Attributes: Individual data fields with types and constraints
  • Input Slice: Subset of the data model defining workflow input parameters
  • Output Slice: Subset of the data model defining workflow output parameters
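As an illustration, a data model with one entity plus its input and output slices (slices are described in the next subsection) could be represented like this. The JSON shape is assembled from the entity and data slice request bodies documented in the API reference below; it is a sketch, not an official storage format:
{
  "entities": [
    {
      "name": "Customer",
      "description": "Customer information",
      "attributes": [
        { "name": "customerId", "type": "string", "required": true },
        { "name": "email", "type": "string", "required": false }
      ]
    }
  ],
  "dataSlices": [
    { "name": "input", "type": "INPUT", "attributes": ["Customer.customerId"] },
    { "name": "output", "type": "OUTPUT", "attributes": ["Customer.email"] }
  ]
}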

Data slices

Data slices are subsets of your data model that define which data is passed as input or returned as output:
  • Input Slice: Automatically pre-fills the Start Node with structured data
  • Output Slice: Defines the data structure returned by End Nodes (planned for future releases)

Node name uniqueness

Node names within a workflow must be unique to ensure clear data lineage and avoid ambiguity when mapping data.
When you rename nodes, the system validates uniqueness. If a duplicate name exists, an index is automatically appended (for example, Transform, Transform_2, Transform_3).

Creating a workflow data model

1. Navigate to Workflow Settings

Open your workflow in Integration Designer and navigate to the Data Model tab in workflow settings.
2. Define Entities

Create entities to represent the logical data structures in your workflow:
  • Click Add Entity
  • Enter an entity name (e.g., Customer, Order)
  • Add a description for documentation
3. Add Attributes

Define attributes for each entity with appropriate data types (a combined sketch follows these lists).
Available Data Types:
  • String: Text values
  • Number: Numeric values (integer or decimal)
  • Boolean: True/false values
  • Object: Nested data structures
  • Array: Lists of values
  • Date: Date and time values
Attribute Configuration:
  • Name and description
  • Data type
  • Required/optional flag
  • Default values
  • Validation rules
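Putting these options together, a fully configured attribute might look like the sketch below. Only name, type, required, and description appear in the API reference later on this page; the defaultValue and validation keys are illustrative assumptions:
{
  "name": "amount",
  "type": "number",
  "required": true,
  "description": "Transaction amount",
  "defaultValue": 0,
  "validation": {
    "min": 0
  }
}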
4. Configure Input Parameters

Define which data will be passed as input to your workflow:
  • Navigate to Input/Output Parameters tab
  • Click Define parameters
  • Select attributes from your data model to include
  • Mark required parameters
Input slice data automatically pre-fills the Start Node when you open the workflow diagram or test at runtime.
5. Save and Test

Save your data model and test the workflow to verify input parameter mapping works correctly.

Input parameter management

Automatic start node pre-filling

When you define an input slice, the Start Node is automatically populated with the structured data.
Before (Manual JSON Editing):
{
  "customerId": "",
  "accountNumber": "",
  "amount": 0
}
After (Automatic from Data Model):
The input JSON is automatically computed from your data model’s input slice; no manual editing is required.
You need to refresh the workflow diagram to see the automatic pre-filling.
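For example, an input slice containing Customer.customerId, Customer.email, and Customer.phoneNumber would yield a pre-filled Start Node input along these lines. Flat keys are assumed here because the mapping examples below match input slice parameters by flat name:
{
  "customerId": "",
  "email": "",
  "phoneNumber": ""
}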

Benefits of input parameters

No Manual Editing

Input parameters automatically pre-fill the Start Node - no JSON editing needed

Type Safety

Data types are enforced at the workflow level, preventing runtime errors

Clear Contracts

Input parameters document exactly what data the workflow expects

Easier Testing

Test workflows with structured input data based on the data model

Integration with processes

Process to workflow data mapping

Use data mappers to pass data from processes to workflows:
1. Add Start Integration Workflow Action

In your process definition, add a Start Integration Workflow action to the node where you want to invoke the workflow.
2. Configure Input Mapping

Map process data to workflow input parameters:
{
  "customerId": "${application.customer.id}",
  "accountNumber": "${application.account.number}",
  "amount": "${application.transaction.amount}"
}
The keys on the left must match your workflow’s input slice parameters.
3. Add Receive Message Task

Add a Receive Message Task node to capture workflow output when the workflow completes.
4. Configure Output Mapping

Create a data mapper on the Receive Message Task to map workflow outputs back to process data:
  • Navigate to node configuration
  • Add Data Mapper
  • Map workflow output keys to process data model attributes

Example credit check workflow

Process Data Model:
{
  "application": {
    "customer": {
      "id": "12345",
      "name": "John Doe"
    },
    "creditScore": null,
    "creditStatus": null
  }
}
Workflow Input Slice:
  • customerId (String, required)
  • customerName (String, required)
Workflow Output:
  • creditScore (Number)
  • creditStatus (String)
Input Mapping (Process → Workflow):
{
  "customerId": "${application.customer.id}",
  "customerName": "${application.customer.name}"
}
Output Mapping (Workflow → Process):
{
  "application.creditScore": "${output.creditScore}",
  "application.creditStatus": "${output.creditStatus}"
}
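Once the workflow completes and the output mapping runs, the process data model would be updated along these lines (the creditScore and creditStatus values are hypothetical):
{
  "application": {
    "customer": {
      "id": "12345",
      "name": "John Doe"
    },
    "creditScore": 720,
    "creditStatus": "APPROVED"
  }
}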

Data model API reference

FlowX.AI 5.3 introduces comprehensive REST API endpoints for managing workflow data models programmatically.

Data model endpoints

Get data model

GET /api/integrations/workflows/{workflowId}/data-model
Returns the complete data model definition for a workflow.

Create or update entity

POST /api/integrations/workflows/{workflowId}/data-model/entities
Request Body:
{
  "name": "Customer",
  "description": "Customer information",
  "attributes": [
    {
      "name": "customerId",
      "type": "string",
      "required": true,
      "description": "Unique customer identifier"
    },
    {
      "name": "email",
      "type": "string",
      "required": false
    }
  ]
}

Delete entity

DELETE /api/integrations/workflows/{workflowId}/data-model/entities/{entityName}

Create or update attribute

POST /api/integrations/workflows/{workflowId}/data-model/entities/{entityName}/attributes
Request Body:
{
  "name": "phoneNumber",
  "type": "string",
  "required": false,
  "description": "Customer phone number"
}

Delete attribute

DELETE /api/integrations/workflows/{workflowId}/data-model/entities/{entityName}/attributes/{attributeName}

Get attribute usages

GET /api/integrations/workflows/{workflowId}/data-model/entities/{entityName}/attributes/{attributeName}/usages
Returns all references to the attribute within the workflow.
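The response format is not documented here; a plausible sketch, assuming usages are reported per node, might be:
{
  "usages": [
    {
      "nodeName": "FetchCustomerData",
      "usageType": "INPUT_MAPPING"
    },
    {
      "nodeName": "TransformResponse",
      "usageType": "OUTPUT_MAPPING"
    }
  ]
}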

Data slice endpoints

Get data slices

GET /api/integrations/workflows/{workflowId}/data-slices
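Returns the workflow’s data slices, presumably as a list of slice objects in the same shape as the POST body below, e.g.:
[
  {
    "name": "input",
    "type": "INPUT",
    "attributes": [
      "Customer.customerId",
      "Customer.email"
    ]
  }
]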

Create or update data slice

POST /api/integrations/workflows/{workflowId}/data-slices
Request Body:
{
  "name": "input",
  "type": "INPUT",
  "attributes": [
    "Customer.customerId",
    "Customer.email",
    "Customer.phoneNumber"
  ]
}

Delete data slice

DELETE /api/integrations/workflows/{workflowId}/data-slices/{sliceName}

Import and export endpoints

Export data model

GET /api/integrations/workflows/{workflowId}/data-model/export
Downloads the data model as a portable JSON file.

Import data model

POST /api/integrations/workflows/{workflowId}/data-model/import
Imports a data model from a JSON file.

Best practices

Data model structure

Use Logical Entities

Group related attributes into meaningful entities that represent domain concepts

Consistent Naming

Use camelCase for attributes and PascalCase for entities consistently

Document Everything

Add descriptions to entities and attributes for better team understanding

Keep It Simple

Start with minimal data models and expand as needed

Input and output design

DO:
  ✅ Define only required input parameters in the input slice
  ✅ Use meaningful parameter names that describe the data (see the contrast below)
  ✅ Set appropriate data types for validation
  ✅ Mark parameters as required when they’re essential
DON’T:
  ❌ Include unnecessary data in input/output slices
  ❌ Use ambiguous or cryptic parameter names
  ❌ Make all parameters optional when some are required
  ❌ Change data model structure without updating mappings
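To make the naming guidance concrete, compare an ambiguous input contract with a descriptive one (both illustrative):
Ambiguous (avoid):
{
  "id": "",
  "val1": 0,
  "flag": false
}
Descriptive (preferred):
{
  "customerId": "",
  "transactionAmount": 0,
  "isExistingCustomer": false
}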

Node naming

Use descriptive node names that indicate their purpose:
  • Good: FetchCustomerData, ValidateCreditScore, TransformResponse
  • Avoid: Node1, Node2, Script
The system enforces unique node names within workflows. If you need similar operations, use descriptive suffixes:
  • FetchCustomerData_Personal
  • FetchCustomerData_Financial

Limitations and compatibility

Current limitations

The following features are not yet available:
  • Output Parameters from End Nodes: End nodes don’t automatically populate from output slices yet (planned for future release)
  • Start Sub-workflow Node: Start sub-workflow nodes work as-is without data model integration
  • Endpoint Schema Integration: Imported Swagger schemas are not integrated with workflow data models yet
  • Database Schema Mapping: Database operation schemas are not mapped to workflow variables yet

Backwards compatibility

Workflow Data Models are fully backwards compatible. Existing workflows continue to work without any changes.
If you have existing workflows:
  • They continue to execute normally without data models
  • You can add data models to existing workflows incrementally
  • No breaking changes to workflow execution or process integration
  • End nodes from existing workflows can be renamed manually for clarity

Migration guide

Adding data models to existing workflows

1. Analyze Current Data Flow

Document what data your workflow currently expects and produces:
  • Review Start Node input JSON
  • Identify data used throughout the workflow
  • Document output data from End Nodes
2. Create Data Model

Build a data model that represents your current data structure:
  • Create entities for major data groupings
  • Add attributes matching your current JSON structure
  • Set appropriate data types
3. Define Input Slice

Create an input slice matching your current Start Node inputs:
  • Select attributes from your data model
  • Mark required parameters
  • Test to ensure automatic pre-fill works correctly
4. Update Process Mappings

If your workflow is invoked from processes, update data mappers to align with the new input slice structure (if needed).
5. Test Thoroughly

Test the workflow end-to-end to verify:
  • Input parameters pre-fill correctly
  • Data flows through nodes as expected
  • Process integration still works
  • Output data is correctly captured


Troubleshooting

The Start Node is not pre-filled with input slice data
Possible causes:
  • Input slice not saved correctly
  • Browser cache issues
Solutions:
  • Save the workflow and refresh the browser
  • Re-open the workflow diagram tab
  • Check that the input slice contains attributes
  • Verify attributes are marked as required if needed
Error message: “Node name must be unique within the workflow”
Solution:
  • Choose a different, more descriptive node name
  • The system will suggest appending an index (e.g., _2)
  • Use descriptive suffixes instead (e.g., _Personal, _Financial)
The workflow does not receive the expected input data from the process
Possible causes:
  • Input slice keys don’t match process data mapper
  • Data types mismatch
Solutions:
  • Ensure data mapper keys match input slice attribute names exactly
  • Verify data types are compatible (string to string, number to number)
  • Check for typos in attribute names
  • Review process data model to workflow data model mapping
Error message: “Entity/Attribute is referenced and cannot be deleted”
Solution:
  • Check attribute usages using the API: GET .../attributes/{name}/usages
  • Remove references from data slices first
  • Update workflow nodes that use the attribute
  • Then retry deletion