Available starting with FlowX.AI 5.3.0: Update Process Variables allows operations teams to modify process variables on active instances to resolve production issues without developer intervention.

Overview

Update Process Variables allows you to modify process variables on active process instances. You can edit, add, or delete variables on process instances with Started or On Hold status.
This feature is part of operational process instance management, allowing teams to resolve data issues without developer intervention.
Process Variables View
The editor provides:
  • JSON syntax validation and formatting
  • Edit, add, or delete operations on variables
  • Automatic synchronization to Task Manager and Elasticsearch
  • Audit logging with snapshots of previous values

Features

JSON editor

The JSON editor offers:
  • JSON syntax validation
  • Syntax highlighting
  • Auto-formatting
  • Error detection
Process Variables View

Edit process variables using the JSON editor with JSON validation

Access points

Access the editor from two locations: the Process Instances page or the View process instance page.

From the Process Instances page:
  1. Navigate to Processes → Active process → Process instances
  2. Click the contextual menu (three dots) on any process instance row
  3. Select Modify Variables
  4. The system navigates to the Process Instance page and opens the Variables tab in edit mode

Dual view modes

Switch between two viewing modes for process variables:
  • Tree View: Hierarchical display of variables with expand/collapse controls
Tree View

Tree view of process variables

  • JSON View: Raw JSON format, editable when in edit mode
JSON View

JSON view of process variables

The Export button is available near the view dropdown for downloading process variables.

Operations

You can:
  • Edit existing attribute values
  • Add new attributes or objects
  • Delete existing attributes or objects
All operations are performed in the JSON editor with validation before saving.

Access requirements

1. Check permissions

Ensure you have the process_variables_edit permission at the workspace level. The Edit option is only visible to users with this permission.

2. Navigate to the process instance

Access from either:
  • Process Instances list → contextual menu
  • Process Instance detail page → secondary navigation

3. Open the Variables tab

Click Edit Variables from the contextual menu. The Variables tab opens in edit mode.

Editing process variables

1. Select an active instance

Choose an active process instance with Started or On Hold status. Only instances in these two statuses can have their variables modified.

2. Switch to JSON View

In the Variables tab, switch from Tree View to JSON View using the view dropdown selector.

3. Enable edit mode

Click the Edit button to make the JSON editor editable. This button is only visible if you have the process_variables_edit permission.

4. Modify variables

Edit process variables in the JSON editor.

Example - Editing existing values:
{
  "application": {
    "customer": {
      "id": "12345",
      "name": "John Doe",
      "email": "[email protected]"
    },
    "status": "approved"
  }
}
Example - Adding new attributes:
{
  "application": {
    "customer": {
      "id": "12345",
      "name": "John Doe",
      "email": "[email protected]",
      "phoneNumber": "+1234567890"
    },
    "status": "approved",
    "creditScore": 750
  }
}
Example - Deleting attributes:

Remove unwanted attributes by deleting their lines from the JSON.
5. Validate JSON

Ensure your JSON is valid before saving; you cannot save while the format is invalid. The editor highlights syntax errors automatically.

Common validation errors:
  • Missing commas between properties
  • Unclosed brackets or braces
  • Missing quotes around strings
  • Trailing commas
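
These failure modes can be checked outside the editor as well. As an illustrative sketch (not FlowX.AI code), Python's standard json module, which follows the same JSON specification the editor enforces, rejects each of them:

```python
import json

def validate(text: str):
    """Return (True, None) for valid JSON, else (False, error details)."""
    try:
        json.loads(text)
        return True, None
    except json.JSONDecodeError as e:
        return False, f"line {e.lineno}, column {e.colno}: {e.msg}"

ok, _ = validate('{"application": {"status": "approved"}}')   # valid document
missing_comma, _ = validate('{"a": 1 "b": 2}')                # missing comma
trailing_comma, _ = validate('{"a": 1,}')                     # trailing comma
unclosed, _ = validate('{"object": {"field": "value"}')       # unclosed brace
```

Running edited JSON through a validator like this before pasting it back catches the same errors the editor flags.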
6. Save changes

Click Save to apply your changes. The system automatically:
  • Creates a snapshot of previous values in a dedicated table
  • Updates the process instance variables
  • Syncs to Task Manager (using existing mechanisms)
  • Updates Elasticsearch (for variables configured for data search)
  • Records the modification in the audit log

7. Review the audit log

Navigate to the Audit Log tab to confirm the modification was recorded with all details.

Permissions and security

Access control: This feature requires the workspace-level permission wks_process_instance_variables_edit (permission name: process_variables_edit). Users with this permission can edit and save variables; others can only view variables in read-only mode. By default, the ORG_ADMIN and WORKSPACE_ADMIN roles include this permission.

Security features: All modifications are logged in the audit trail with user identity and timestamp. Previous values are preserved in snapshot history before changes are applied. JSON syntax validation prevents data corruption.

For more information about permissions, roles, and access management, see Workspaces Access Rights and the Permission Reference Guide.

Data synchronization

Updated automatically

When you modify process variables, the following are updated:

Process instance parameters

Variables are updated immediately in the database

Task Manager

Variables sync using existing mechanisms; task keywords and data remain consistent

Elasticsearch

Variables configured for data search are updated automatically

Snapshot table

Previous values are saved in a snapshot table before changes are applied

Not updated

The following are preserved to maintain data integrity:

Process snapshots

Snapshots used for back navigation (Reset process data) remain unchanged

Debug records

Process instance debug records preserve execution history as it occurred

Existing audit logs

Previously recorded audit logs (manual actions, integrations) are not modified

Subprocess variables

Child subprocess variables are not updated; parent changes don’t cascade to children
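
The parent-child isolation described above behaves like a deep copy taken at subprocess start. This illustrative Python sketch (not FlowX.AI code) shows why later parent edits leave the child untouched:

```python
import copy

# Parent instance variables at the moment the subprocess starts
parent_vars = {"application": {"status": "pending", "income": 50000}}

# The subprocess receives its own copy of the variables
child_vars = copy.deepcopy(parent_vars)

# An operator later edits the parent's variables...
parent_vars["application"]["status"] = "approved"

# ...but the subprocess keeps the values it started with:
# child_vars["application"]["status"] is still "pending"
```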


Audit logging

Audit log structure

Every process variable modification creates a detailed audit log entry with the following information:
| Field | Description | Example |
|---|---|---|
| Workspace | Workspace ID where modification occurred | ws-12345 |
| Feature | Feature being used | Process variables |
| Section | Section within the feature | Process Instance |
| App ID | Application UUID | app-uuid-123 |
| Application Name | Name of the app | Loan Application |
| Subject | What was modified | Process variables |
| Event | Type of event | Edit |
| Subject Identifier | Process instance UUID | pi-uuid-456 |
| Body | JSON with new process parameters | Complete JSON object |
| User | User who made the change | [email protected] |
| Timestamp | When the change occurred | 2025-01-15T10:30:00Z |

Viewing audit logs

Audit logs are displayed in the Audit Log tab within the process instance:
1. Open the process instance

Navigate to the process instance that was modified.

2. Go to the Audit Log tab

Click the Audit Log tab to view all events for this instance.

3. Find the modification entry

Look for entries with Event = "Edit" and Subject = "Process variables".

4. Review details

Click the audit log entry to see:
  • Complete before/after values (in the Body field)
  • The user who made the change
  • The exact timestamp
  • All metadata
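
When reviewing many events, the Event and Subject fields are enough to isolate variable modifications. A minimal sketch, assuming audit entries are exported as plain dictionaries with the field names from the table above:

```python
# Hypothetical exported audit entries; field names follow the audit log table
entries = [
    {"Event": "Edit", "Subject": "Process variables",
     "User": "[email protected]", "Timestamp": "2025-01-15T10:30:00Z"},
    {"Event": "Start", "Subject": "Process instance",
     "User": "system", "Timestamp": "2025-01-15T09:00:00Z"},
]

# Keep only variable-modification events
modifications = [
    e for e in entries
    if e["Event"] == "Edit" and e["Subject"] == "Process variables"
]
```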

Compliance and traceability

  • Who: user identity
  • What: complete JSON body
  • When: timestamp
  • Where: workspace, app, instance
  • Previous values: snapshot history

Important considerations

Subprocess impact

Limitation: When modifying parent process variables after subprocesses have started, child processes retain their original values. Variables are not propagated to existing subprocesses.

Back navigation

Limitation: Process snapshots used for “Allow back to this action” with “Reset process data” are not updated. Users navigating back will see original values, not modified ones.

Integration messages

Limitation: If a token is waiting for an integration message, the integration was called with original values. Modifying variables after integration calls does not re-trigger integrations.

Process instance debug table

Not updated: the process instance debug table records remain unchanged. Reason: debug records capture execution history as it occurred. Modifying them would compromise debugging and troubleshooting capabilities.

Best practices

When to use

  • Correct user input errors: fix incorrect data submitted in forms
  • Unblock stuck instances: resolve missing data preventing continuation
  • Fix integration data issues: correct incorrect responses from external systems
  • Add missing attributes: add required data not collected initially
  • Remove invalid data: clean up test or incorrect data
  • Update search keywords: correct Task Manager search data
  • Fix Elasticsearch data: ensure accurate search results

When not to use

  • Regular operational updates: design proper process flows instead
  • Subprocess coordination: parent-child sync is not supported
  • Complex back navigation flows: snapshots won't reflect changes
  • While waiting for integrations: responses are based on original values
  • Routine data corrections: recurring fixes indicate process design issues
  • Automated scripts: use process design and actions instead

Validation before saving

  • Verify JSON syntax: ensure valid JSON format to prevent data corruption and save failures
  • Review impact: understand which variables you're changing and their effect on the process flow
  • Check dependencies: review the process flow, subprocesses, and integrations before modifying critical variables
  • Document changes: audit logs record changes automatically, but also document why changes were made

Testing and validation

1. Test in lower environments first

When possible, test variable modifications in staging or UAT before production.

2. Verify Elasticsearch sync

If variables are used in data search, confirm updates synced successfully by performing a search.

3. Check Task Manager

Verify task keywords updated if variables affect task search or filtering.

4. Monitor the process flow

Watch the process after modification to ensure it continues executing as expected.

5. Review the audit log

Confirm the modification was logged correctly with all expected details.

Operational guidelines

Before modifying:
  • Review process state: check the current process state and token position
  • Check subprocesses: verify whether subprocesses are running
  • Verify integrations: ensure no pending integration responses
  • Understand navigation: review back navigation points
  • Document the reason: record why the modification is needed

Common use cases

1. Correct user input errors

Scenario:
Customer submitted incorrect loan data (wrong income amount, invalid address, incorrect contact information).
Solution:
  1. Navigate to the process instance
  2. Access Variables tab in edit mode
  3. Locate the incorrect values in JSON
  4. Correct the data:
    {
      "application": {
        "income": 75000,  // Changed from 50000
        "address": "123 Main St, New York, NY 10001"  // Corrected
      }
    }
    
  5. Save changes
  6. Process continues without requiring customer to restart the process

2. Unblock stuck process instances

Scenario:
Process is waiting for external data that never arrives due to integration timeout or external system being down indefinitely.
Solution:
  1. Identify the variable causing the blockage
  2. Open process variables in edit mode
  3. Update the variable to provide expected data or bypass condition:
    {
      "integration": {
        "responseReceived": true,  // Manually set to true
        "creditCheckStatus": "completed"  // Provide expected value
      }
    }
    
  4. Save changes
  5. Instance unblocks and continues execution

3. Fix integration data issues

Scenario:
External credit check system returned incorrect or incomplete data due to system error.
Solution:
  1. Review the integration response variables
  2. Edit variables to correct the data:
    {
      "creditCheck": {
        "score": 720,  // Corrected from incorrect 520
        "status": "approved",  // Fixed status
        "provider": "ExternalCreditBureau"
      }
    }
    
  3. Save changes
  4. Process continues with accurate data

4. Add missing attributes

Scenario:
Process needs additional data that wasn’t collected at start time, but is now required for a decision point.
Solution:
  1. Access process variables
  2. Add new required attributes:
    {
      "application": {
        "existingField": "value",
        "newlyRequiredField": "new value",  // Added
        "additionalData": {  // Added entire object
          "field1": "value1",
          "field2": "value2"
        }
      }
    }
    
  3. Save changes
  4. Process can now access the missing data at decision points

5. Delete invalid data

Scenario:
Process contains test data or incorrect objects that shouldn’t be there and are causing validation failures.
Solution:
  1. Open process variables
  2. Remove problematic data:
    {
      "application": {
        "validData": "correct value"
        // Removed "testData": {...}
        // Removed "invalidObject": {...}
      }
    }
    
  3. Save changes
  4. Process continues with clean, valid data only

6. Update task manager keywords

Scenario:
Task search keywords need updating based on corrected customer data for better task discoverability.
Solution:
  1. Modify variables that are configured as Task Manager keywords:
    {
      "customer": {
        "name": "Jane Smith",  // Corrected
        "accountNumber": "ACC-98765",  // Updated
        "status": "active"
      }
    }
    
  2. Save changes
  3. System automatically syncs to Task Manager
  4. Tasks become searchable with correct keywords

7. Fix Elasticsearch search data

Scenario:
Process data indexed for search contains errors, making instances unsearchable or showing in wrong search results.
Solution:
  1. Identify variables configured for Elasticsearch data search
  2. Correct the values:
    {
      "searchableData": {
        "category": "loans",  // Corrected from "loans_test"
        "priority": "high",  // Fixed
        "customerSegment": "premium"  // Updated
      }
    }
    
  3. Save changes
  4. Elasticsearch automatically updates
  5. Process instances appear in correct search results

8. Emergency business rule changes

Scenario:
Regulatory change requires immediate update to pricing or terms in active applications.
Solution:
  1. Access affected process instances
  2. Update variables to reflect new business rules:
    {
      "pricing": {
        "interestRate": 3.5,  // Updated per regulation
        "fees": {
          "processing": 50,  // Adjusted
          "regulatory": 25   // New fee added
        }
      }
    }
    
  3. Save changes for all affected instances
  4. Processes continue with compliant data

Troubleshooting

Cannot edit variables

Symptoms:
  • Cannot see Edit option in contextual menu
  • Variables tab shows read-only view only
Possible causes:
  • User doesn’t have process_variables_edit permission
  • Process instance is not active (not Started or On Hold)
  • Wrong tab selected
Solutions:
  1. Verify user has process_variables_edit permission:
    • Check workspace permissions
    • Confirm user role includes this permission
    • Contact workspace administrator if needed
  2. Check process instance status:
    • Only Started or On Hold instances can be edited
    • Finished, Failed, or Terminated instances cannot be modified
    • Verify status in process instance details
  3. Ensure you’re in Variables tab:
    • Switch to Variables tab if in another tab
    • Refresh the page if needed
Invalid JSON errors

Symptoms:
  • Save button disabled or shows error
  • Red error indicators in editor
  • Error message: “Invalid JSON format”
Common JSON errors:

Missing comma:
{
  "field1": "value1"  // ❌ Missing comma here
  "field2": "value2"
}
Correct:
{
  "field1": "value1",  // ✅ Comma added
  "field2": "value2"
}
Trailing comma:
{
  "field1": "value1",
  "field2": "value2",  // ❌ Trailing comma
}
Unclosed brackets:
{
  "object": {
    "field": "value"
  // ❌ Missing closing }
}
Solutions:
  1. Use JSON editor error highlights to locate issues
  2. Copy JSON to external validator (jsonlint.com)
  3. Check for common issues: commas, brackets, quotes
  4. Validate structure matches JSON specification
  5. Remove trailing commas
  6. Ensure all brackets and braces are closed
Task Manager shows old values

Symptoms:
  • Task Manager still shows old values
  • Task search doesn’t find with new keywords
  • Task data appears outdated
Possible causes:
  • Task Manager sync delay
  • Variables not configured for Task Manager keywords
  • Task Manager service issues
  • Cache issues
Solutions:
  1. Wait and refresh:
    • Wait 10-30 seconds for sync
    • Refresh Task Manager view
    • Clear browser cache if needed
  2. Verify configuration:
    • Check if variables are configured as task keywords
    • Review Task Manager configuration
    • Confirm variable mapping is correct
  3. Check system health:
    • Verify Task Manager service is running
    • Review sync logs for errors
    • Check Kafka messages if available
  4. Verify save completed:
    • Check audit log for successful save
    • Confirm no error messages appeared
    • Review process instance variables show new values
Elasticsearch search not updated

Symptoms:
  • Search doesn’t return expected results
  • Old values appear in search results
  • Process instance not found with new criteria
Possible causes:
  • Elasticsearch sync delay
  • Variables not configured for data search
  • Index refresh delay
  • Elasticsearch service issues
Solutions:
  1. Wait for index refresh:
    • Elasticsearch typically refreshes within seconds
    • Wait up to 1 minute for propagation
    • Try search again after waiting
  2. Verify configuration:
    • Check if variables are configured for Elasticsearch indexing
    • Review process definition for data search settings
    • Confirm variable paths match index mapping
  3. Check Elasticsearch health:
    • Verify Elasticsearch service is running
    • Check index status
    • Review indexing logs for errors
  4. Manual verification:
    • Query Elasticsearch directly if possible
    • Check index document for process instance
    • Verify field values match expected data
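
For the "query Elasticsearch directly" step, a standard term query on the instance identifier is usually enough. The index and field names below are assumptions, not FlowX.AI's actual mapping; substitute the ones your deployment indexes:

```python
import json

# Hypothetical field names -- adjust to your deployment's index mapping
query = {
    "query": {"term": {"processInstanceUuid": "pi-uuid-456"}},
    "_source": ["searchableData.category", "searchableData.priority"],
}

# Request body to POST to <elasticsearch-host>/<process-index>/_search
body = json.dumps(query)
```

Comparing the returned `_source` fields against the values you saved confirms whether the index caught up.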
Variables revert after back navigation

Symptoms:
  • User navigates back in process
  • Variables revert to original values
  • Recent modifications disappeared
This is expected behavior, not a bug.

Explanation: Process snapshots used for back navigation are not updated when you modify variables. When users navigate backwards, the process state is restored from the snapshot, reverting to original values.

Why this happens:
  • Snapshots preserve historical process state
  • Back navigation restores exact point-in-time state
  • Maintains integrity of back navigation feature
Recommendations:
  1. Document this behavior:
    • Inform operations team about snapshot behavior
    • Add notes to operational procedures
    • Train users on implications
  2. Prevent back navigation:
    • Disable back navigation for nodes after modification
    • Configure “Allow back to this action” appropriately
    • Consider process design implications
  3. Alternative approaches:
    • Modify variables only when back navigation unlikely
    • Re-apply modifications after back navigation if needed
    • Design processes to minimize back navigation needs
  4. User warnings:
    • Display warnings before allowing back navigation
    • Inform users modifications may be lost
    • Provide clear guidance on expected behavior
Subprocess variables unchanged

Symptoms:
  • Parent process variables updated
  • Subprocess still has old values
  • Data inconsistency between parent and child
This is expected behavior, not a bug.

Explanation: Subprocesses receive a copy of variables at start time and execute independently. Modifying parent variables doesn't cascade to existing child subprocesses.

Why this happens:
  • Subprocesses have independent execution context
  • Each subprocess operates on its snapshot of data
  • Maintains process isolation and integrity
Solutions:
  1. Understand before modifying:
    • Check if process has active subprocesses
    • Review parent-child data dependencies
    • Assess impact of data divergence
  2. Manual coordination:
    • Modify subprocess variables separately if needed
    • Document intentional data divergence
    • Test subprocess behavior with original values
  3. Process design considerations:
    • Design processes to minimize parent-child data coupling
    • Pass critical data as subprocess parameters
    • Use events for dynamic data synchronization
Integration response mismatch

Symptoms:
  • Integration called before modification
  • Response based on original values
  • Data mismatch between variables and integration result
This is expected behavior, not a bug.

Explanation: Integrations are called with specific parameters, and external systems process requests independently. Modifying variables after the call doesn't re-trigger the integration or affect the response.

Why this happens:
  • Integration already executed with original parameters
  • External system processed original request
  • Response is based on what was sent
  • Modifying variables is local to process instance
Solutions:
  1. Avoid modifying during integration:
    • Check if token is waiting for integration response
    • Wait for integration to complete before modifying
    • Document which variables were sent to integration
  2. Re-execute if needed:
    • Consider re-executing the integration node
    • Manually trigger integration with new values if possible
    • Design retry mechanisms in process flow
  3. Accept response:
    • Use integration response as-is
    • Modify only non-integration-related variables
    • Document why integration response doesn’t match