Overview

Define data models at the workflow level with input and output parameters, similar to Process Definitions. Key capabilities:
- Structured Data Management
- Input/Output Parameters
- Process Integration
- Type Safety & Validation
Key concepts
Workflow data model
A Workflow Data Model defines the structure of data used throughout a workflow’s execution. It consists of (see the sketch after this list):
- Entities: Logical groupings of related data (for example, Customer, Order, Payment)
- Attributes: Individual data fields with types and constraints
- Input Slice: Subset of the data model defining workflow input parameters
- Output Slice: Subset of the data model defining workflow output parameters
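The internal representation is specific to FlowX.AI; purely as a hypothetical sketch (entity and attribute names are invented for illustration, not the actual storage format), a data model with two entities might look like this:

```python
# Illustrative sketch only -- not the actual FlowX.AI storage format.
workflow_data_model = {
    "entities": {
        "Customer": {
            "description": "Data about the customer driving the workflow",
            "attributes": {
                "customerId": {"type": "String", "required": True},
                "customerName": {"type": "String", "required": True},
            },
        },
        "Order": {
            "description": "Order details handled by the workflow",
            "attributes": {
                "orderTotal": {"type": "Number", "required": False},
            },
        },
    },
}
```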
Data slices
Data slices are subsets of your data model that define which data is passed as input or returned as output (see the sketch after this list):
- Input Slice: Automatically pre-fills the Start Node with structured data
- Output Slice: Defines the data structure returned by End Nodes (planned for future releases)
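Continuing the hypothetical sketch above, slices can be pictured as lists of attribute paths selected from the model; the representation below is an assumption for illustration, not the actual FlowX.AI format:

```python
# Slices reference attributes defined in workflow_data_model above (illustration only).
input_slice = ["Customer.customerId", "Customer.customerName"]  # pre-fills the Start Node
output_slice = ["Order.orderTotal"]                             # returned by End Nodes (planned)
```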
Node name uniqueness
When you rename nodes, the system validates uniqueness. If a duplicate name exists, an index is automatically appended (for example, Transform, Transform_2, Transform_3).
Creating a workflow data model
Navigate to Workflow Settings
Define Entities
- Click Add Entity
- Enter an entity name (e.g., Customer, Order)
- Add a description for documentation
Add Attributes
Supported data types:
- String: Text values
- Number: Numeric values (integer or decimal)
- Boolean: True/false values
- Object: Nested data structures
- Array: Lists of values
- Date: Date and time values
For each attribute, configure (a sketch follows this list):
- Name and description
- Data type
- Required/optional flag
- Default values
- Validation rules
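As a hypothetical example tying these properties together (the validation rule names are assumptions, not documented FlowX.AI options), a single attribute definition might look like:

```python
# Hypothetical attribute definition showing the configurable properties listed above.
credit_score_attribute = {
    "name": "creditScore",
    "description": "Credit score returned by the external bureau",
    "type": "Number",           # String | Number | Boolean | Object | Array | Date
    "required": False,
    "default": 0,
    "validation": {             # rule names are illustrative assumptions
        "min": 300,
        "max": 850,
    },
}
```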
Configure Input Parameters
- Navigate to Input/Output Parameters tab
- Click Define parameters
- Select attributes from your data model to include
- Mark required parameters

Save and Test
Input parameter management
Automatic start node pre-filling
When you define an input slice, the Start Node is automatically populated with the structured data, so you no longer need to hand-edit the Start Node input JSON.
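As an illustration (the payload below is hypothetical), the difference looks roughly like this:

```python
# Before: Start Node input typed by hand as free-form JSON.
manual_start_input = {
    "customerId": "C-1001",
    "customerName": "Jane Doe",
}

# After: keys are pre-filled from the input slice; only values need to be supplied
# (a sketch, not the actual UI payload).
prefilled_start_input = {
    "customerId": "",    # String, required
    "customerName": "",  # String, required
}

# The pre-filled structure covers exactly the keys defined in the input slice.
assert set(prefilled_start_input) == {"customerId", "customerName"}
```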
Benefits of input parameters
No Manual Editing
Type Safety
Clear Contracts
Easier Testing
Integration with processes
Process to workflow data mapping
Use data mappers to pass data from processes to workflows:
Add Start Integration Workflow Action
Configure Input Mapping
Add Receive Message Task
Configure Output Mapping
- Navigate to node configuration
- Add Data Mapper
- Map workflow output keys to process data model attributes
Example credit check workflow
Process Data Model (see the mapping sketch below):
- customerId (String, required)
- customerName (String, required)
- creditScore (Number)
- creditStatus (String)
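A minimal sketch of the two mappings for this example, assuming the workflow’s input slice exposes customerId and customerName and its output includes creditScore and creditStatus. The process-side paths and the mapper format below are illustrative assumptions, not the actual FlowX.AI configuration:

```python
# Start Integration Workflow action: process data model -> workflow input slice.
input_mapping = {
    "application.customer.customerId": "customerId",      # process key -> workflow input key
    "application.customer.customerName": "customerName",
}

# Receive Message Task: workflow output -> process data model.
output_mapping = {
    "creditScore": "application.creditCheck.creditScore",   # workflow output key -> process key
    "creditStatus": "application.creditCheck.creditStatus",
}
```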
Data model API reference
FlowX.AI 5.3 introduces comprehensive REST API endpoints for managing workflow data models programmatically.
Data model endpoints
Get data model
Create or update entity
Delete entity
Create or update attribute
Delete attribute
Get attribute usages
Data slice endpoints
Get data slices
Create or update data slice
Delete data slice
Import and export endpoints
Export data model
Import data model
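Consult the API reference for exact paths and authentication. As a rough sketch only (the base URL, path, and header below are assumptions), checking where an attribute is used might look like:

```python
import requests

# Hypothetical values -- replace with your instance URL, workflow ID, and token.
BASE_URL = "https://flowx.example.com/integration-designer/api"
WORKFLOW_ID = "<workflow-id>"
ATTRIBUTE_NAME = "customerId"

# Get attribute usages before deleting an attribute (path is an assumption).
response = requests.get(
    f"{BASE_URL}/workflows/{WORKFLOW_ID}/data-model/attributes/{ATTRIBUTE_NAME}/usages",
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```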
Best practices
Data model structure
Use Logical Entities
Consistent Naming
Document Everything
Keep It Simple
Input and output design
DO:
✅ Define only required input parameters in the input slice
✅ Use meaningful parameter names that describe the data
✅ Set appropriate data types for validation
✅ Mark parameters as required when they’re essential
DON’T:
❌ Include unnecessary data in input/output slices
❌ Use ambiguous or cryptic parameter names
❌ Make all parameters optional when some are required
❌ Change data model structure without updating mappings
Node naming
The system enforces unique node names within workflows. If you need similar operations, use descriptive suffixes: FetchCustomerData_Personal, FetchCustomerData_Financial.
Limitations and compatibility
Current limitations
- Output Parameters from End Nodes: End nodes don’t automatically populate from output slices yet (planned for future release)
- Start Sub-workflow Node: Start sub-workflow nodes work as-is without data model integration
- Endpoint Schema Integration: Imported Swagger schemas are not integrated with workflow data models yet
- Database Schema Mapping: Database operation schemas are not mapped to workflow variables yet
Backwards compatibility
- Existing workflows continue to execute normally without data models
- You can add data models to existing workflows incrementally
- No breaking changes to workflow execution or process integration
- End nodes from existing workflows can be renamed manually for clarity
Migration guide
Adding data models to existing workflows
Analyze Current Data Flow
- Review Start Node input JSON
- Identify data used throughout the workflow
- Document output data from End Nodes
Create Data Model
- Create entities for major data groupings
- Add attributes matching your current JSON structure
- Set appropriate data types
Define Input Slice
- Select attributes from your data model
- Mark required parameters
- Test to ensure automatic pre-fill works correctly
Update Process Mappings
Test Thoroughly
Verify that:
- Input parameters pre-fill correctly
- Data flows through nodes as expected
- Process integration still works
- Output data is correctly captured
Related resources
Integration Designer Overview
Process Data Model
Data Mappers
Start Integration Workflow Action
Troubleshooting
Start Node input is empty after defining input slice
Possible causes:
- Input slice not saved correctly
- Browser cache issues
Solutions:
- Save the workflow and refresh the browser
- Re-open the workflow diagram tab
- Check that the input slice contains attributes
- Verify attributes are marked as required if needed
Node name uniqueness validation error
- Choose a different, more descriptive node name
- The system will suggest appending an index (e.g., _2)
- Use descriptive suffixes instead (e.g., _Personal, _Financial)
Data mapper not working after adding data model
Possible causes:
- Input slice keys don’t match process data mapper
- Data types mismatch
Solutions:
- Ensure data mapper keys match input slice attribute names exactly
- Verify data types are compatible (string to string, number to number)
- Check for typos in attribute names
- Review process data model to workflow data model mapping
Cannot delete entity or attribute
- Check attribute usages using the API: GET .../attributes/{name}/usages
- Remove references from data slices first
- Update workflow nodes that use the attribute
- Then retry deletion

