The plugin is available as a Docker image.
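All configuration described below is supplied through environment variables, which can be passed to the container one by one or collected in an env file. A minimal sketch, assuming a placeholder image reference (the actual image name depends on your registry and release):

```
# Pass a variable directly...
docker run -e SECURITY_TYPE=oauth2 <task-management-plugin-image>

# ...or keep all KEY=value pairs (as in the snippets below) in an env file
docker run --env-file ./task-management.env <task-management-plugin-image>
```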
SECURITY_OAUTH2_BASE_SERVER_URL
SECURITY_OAUTH2_CLIENT_CLIENT_ID
SECURITY_OAUTH2_CLIENT_CLIENT_SECRET
SECURITY_OAUTH2_REALM
SECURITY_TYPE: Indicates that OAuth 2.0 is the chosen security type; default value: oauth2.
SECURITY_PATHAUTHORIZATIONS_0_PATH: Defines a security path or endpoint pattern. In this case, it specifies that the security settings apply to all paths under "/api/". The ** is a wildcard that includes all subpaths under "/api/**".
SECURITY_PATHAUTHORIZATIONS_0_ROLESALLOWED: Specifies the roles allowed for accessing the specified path. In this case, the roles allowed are empty (""), which might imply that access to the "/api/**" paths is open to all users or that no specific roles are required for authorization.
SECURITY_OAUTH2_BASE_SERVER_URL: Specifies the base URL of the OpenID server, which is used for authentication and authorization.
SECURITY_OAUTH2_SERVICE_ACCOUNT_ADMIN_CLIENT_ID: The task management service account is used to facilitate process initiation, enable the use of the task management plugin (requiring the FLOWX_ROLE and role mapper), and access data from Keycloak.
SECURITY_OAUTH2_SERVICE_ACCOUNT_ADMIN_CLIENT_SECRET: Along with the client ID, you must also specify the client secret associated with the service account for proper authentication.
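Putting the settings above together, a possible env-file fragment could look like the sketch below. The identity provider host, realm, and client identifiers are placeholders (assumptions about your setup); SECURITY_TYPE, the "/api/**" path, and the empty roles value follow the descriptions above:

```
SECURITY_TYPE=oauth2
# placeholder OpenID server URL, realm, and client credentials
SECURITY_OAUTH2_BASE_SERVER_URL=https://<identity-provider-host>
SECURITY_OAUTH2_REALM=<realm-name>
SECURITY_OAUTH2_CLIENT_CLIENT_ID=<client-id>
SECURITY_OAUTH2_CLIENT_CLIENT_SECRET=<client-secret>
SECURITY_PATHAUTHORIZATIONS_0_PATH=/api/**
SECURITY_PATHAUTHORIZATIONS_0_ROLESALLOWED=
SECURITY_OAUTH2_SERVICE_ACCOUNT_ADMIN_CLIENT_ID=<service-account-client-id>
SECURITY_OAUTH2_SERVICE_ACCOUNT_ADMIN_CLIENT_SECRET=<service-account-client-secret>
```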
SPRING_DATASOURCE_URL
SPRING_DATASOURCE_USERNAME
SPRING_DATASOURCE_PASSWORD
SPRING_DATA_MONGODB_URI - the URI for the MongoDB database
SPRING_REDIS_HOST
SPRING_REDIS_PASSWORD
REDIS_TTL
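These follow standard Spring Boot property naming. A sketch with placeholder values; the PostgreSQL-style JDBC URL, ports, and database names are assumptions and should be replaced with the values used in your deployment:

```
# relational datasource (JDBC URL shown for PostgreSQL only as an example)
SPRING_DATASOURCE_URL=jdbc:postgresql://<db-host>:5432/<database>
SPRING_DATASOURCE_USERNAME=<db-user>
SPRING_DATASOURCE_PASSWORD=<db-password>

# MongoDB and Redis
SPRING_DATA_MONGODB_URI=mongodb://<mongo-user>:<mongo-password>@<mongo-host>:27017/<database>
SPRING_REDIS_HOST=<redis-host>
SPRING_REDIS_PASSWORD=<redis-password>
# value and unit depend on your caching policy
REDIS_TTL=<ttl>
```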
SPRING_KAFKA_BOOTSTRAP_SERVERS - address of the Kafka server
SPRING_KAFKA_CONSUMER_GROUP_ID - group of consumers
KAFKA_CONSUMER_THREADS - the number of Kafka consumer threads
KAFKA_CONSUMER_EXCLUDE_USERS_THREADS -
KAFKA_AUTH_EXCEPTION_RETRY_INTERVAL - the interval between retries after an AuthorizationException is thrown by KafkaConsumer
KAFKA_MESSAGE_MAX_BYTES - the largest message size that can be received by the broker from a producer
KAFKA_TOPIC_PROCESS_START_OUT - used for running hooks; the engine receives a start process request for a hook on this topic, and it needs to be matched with the corresponding ...start_in topic on the engine side
KAFKA_TOPIC_PROCESS_OPERATIONS_OUT - used to update the engine on task manager operations such as assignment, unassignment, hold, unhold, and terminate; it is matched with the ...operations_in topic on the engine side
KAFKA_TOPIC_PROCESS_SCHEDULE_IN - used to receive a message from the task manager when it's time to run a hook (for hooks configured with an SLA)
KAFKA_TOPIC_PROCESS_SCHEDULE_OUT_SET - sends a message to the scheduler to set hooks or to exclude users from automatic assignment when they use the out-of-office feature; it needs to be matched with the configuration in the scheduler
KAFKA_TOPIC_PROCESS_SCHEDULE_OUT_STOP - sends a message to the scheduler to stop the schedule for the above actions; it needs to be matched with the configuration in the scheduler
KAFKA_TOPIC_EXCLUDE_USERS_SCHEDULE_IN - used to receive a message from the scheduler when users need to be excluded
KAFKA_TOPIC_TASK_IN - used to receive a message from the engine to start a new task; it needs to be matched with the corresponding task_out topic on the engine side
KAFKA_TOPIC_EVENTS_GATEWAY_OUT_MESSAGE - outgoing messages for the Events Gateway
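A sketch of the Kafka-related variables; the broker address, consumer group, thread count, retry interval, and topic names are all placeholders, and each OUT/IN topic must be paired with the corresponding topic configured on the engine or scheduler side as described above:

```
SPRING_KAFKA_BOOTSTRAP_SERVERS=<kafka-host>:9092
SPRING_KAFKA_CONSUMER_GROUP_ID=<consumer-group>
KAFKA_CONSUMER_THREADS=1
KAFKA_AUTH_EXCEPTION_RETRY_INTERVAL=<retry-interval>
# must stay within what the broker and topics are configured to accept
KAFKA_MESSAGE_MAX_BYTES=<max-message-size-in-bytes>
# out-topics pair with the engine's ...start_in / ...operations_in topics,
# and KAFKA_TOPIC_TASK_IN pairs with the engine's task_out topic
KAFKA_TOPIC_PROCESS_START_OUT=<start-out-topic>
KAFKA_TOPIC_PROCESS_OPERATIONS_OUT=<operations-out-topic>
KAFKA_TOPIC_TASK_IN=<task-in-topic>
```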
LOGGING_LEVEL_ROOT - root Spring Boot microservice logs
LOGGING_LEVEL_APP - app-level logs
LOGGING_LEVEL_MONGO_DRIVER - MongoDB driver logs
FLOWX_ALLOW_USERNAME_SEARCH_PARTIAL - filter possible assignees by partial names (default: true)
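The logging variables accept standard Spring Boot log levels (for example ERROR, WARN, INFO, DEBUG); the values below are only an illustrative sketch:

```
LOGGING_LEVEL_ROOT=INFO
LOGGING_LEVEL_APP=DEBUG
LOGGING_LEVEL_MONGO_DRIVER=INFO
# default: true (allows filtering possible assignees by partial name)
FLOWX_ALLOW_USERNAME_SEARCH_PARTIAL=true
```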