# Notifications plugin setup
The Notifications plugin ships as a Docker image and has the dependencies listed below.

## Dependencies

Before setting up the plugin, make sure the following are available (a local sandbox sketch follows the list):
- A MongoDB database for storing notification templates and records
- Access to the Kafka instance used by the FlowX.AI Engine
- A Redis instance for caching notification templates
- An S3-compatible file storage solution (e.g., Min.io) if you need to attach documents to notifications
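For a quick local sandbox, a minimal docker-compose sketch along these lines can provide the backing services. Image tags and credentials are placeholders, and Kafka is intentionally absent because the plugin reuses the instance already running with the FlowX.AI Engine:

```yaml
# Illustrative local sandbox only; image tags and credentials are placeholders.
# Kafka is not listed because the plugin reuses the FlowX.AI Engine's Kafka instance.
services:
  mongodb:
    image: mongo:6
  redis:
    image: redis:7
    command: redis-server --requirepass defaultpassword
  minio:
    image: minio/minio
    command: server /data
    environment:
      MINIO_ROOT_USER: minio
      MINIO_ROOT_PASSWORD: minio-secret   # MinIO requires at least 8 characters
```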
## Authorization configuration

Set the following variables to connect the plugin to your identity management platform (an example follows the table):

| Environment Variable | Description | Default Value |
|---|---|---|
| `SECURITY_OAUTH2_BASESERVERURL` | Base URL of the OAuth2/OIDC server | - |
| `SECURITY_OAUTH2_CLIENT_CLIENTID` | OAuth2 client ID for the Notifications plugin | - |
| `SECURITY_OAUTH2_CLIENT_CLIENTSECRET` | OAuth2 client secret | - |
| `SECURITY_OAUTH2_REALM` | OAuth2 realm name | - |
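A sketch of how these might be wired up in a docker-compose `environment` block; the identity server URL, realm, and client credentials are placeholders:

```yaml
environment:
  SECURITY_OAUTH2_BASESERVERURL: https://idm.example.com/auth   # placeholder IdP URL
  SECURITY_OAUTH2_CLIENT_CLIENTID: notification-plugin          # placeholder client ID
  SECURITY_OAUTH2_CLIENT_CLIENTSECRET: ${NOTIF_CLIENT_SECRET}   # placeholder; inject from a secret store
  SECURITY_OAUTH2_REALM: flowx                                  # placeholder realm name
```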
## MongoDB configuration

Only the database connection details need to be configured; the plugin handles the rest (see the example after the table).

| Environment Variable | Description | Default Value |
|---|---|---|
| `SPRING_DATA_MONGODB_URI` | MongoDB connection URI | `mongodb://${DB_USERNAME}:${DB_PASSWORD}@mongodb-0.mongodb-headless,mongodb-1.mongodb-headless,mongodb-arbiter-0.mongodb-arbiter-headless:27017/notification-plugin` |
| `DB_USERNAME` | Username for the runtime MongoDB connection | `notification-plugin` |
| `DB_PASSWORD` | Password for the runtime MongoDB connection | `password` |
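A hedged example, assuming the default replica-set URI matches your topology and only the credentials need to change; hostnames and secrets are placeholders:

```yaml
environment:
  DB_USERNAME: notification-plugin
  DB_PASSWORD: ${NOTIF_DB_PASSWORD}   # placeholder; inject from a secret store
  # Override the full URI only if your MongoDB topology differs from the default, e.g.:
  # SPRING_DATA_MONGODB_URI: mongodb://notification-plugin:<password>@mongodb.example.com:27017/notification-plugin
```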
## Redis configuration

Configure the Redis caching instance:

| Environment Variable | Description | Default Value |
|---|---|---|
| `SPRING_REDIS_HOST` | Redis server hostname | `localhost` |
| `SPRING_REDIS_PORT` | Redis server port | `6379` |
| `SPRING_REDIS_PASSWORD` | Redis server password | `defaultpassword` |
| `REDIS_TTL` | Time-to-live for the Redis cache (milliseconds) | `5000000` |
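For instance (the hostname and password are placeholders; the TTL below is simply the documented default, roughly 83 minutes):

```yaml
environment:
  SPRING_REDIS_HOST: redis-master           # placeholder hostname
  SPRING_REDIS_PORT: "6379"
  SPRING_REDIS_PASSWORD: ${REDIS_PASSWORD}  # placeholder; inject from a secret store
  REDIS_TTL: "5000000"                      # milliseconds (~83 minutes)
```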
## Kafka configuration

### Core Kafka settings

| Environment Variable | Description | Default Value |
|---|---|---|
| `SPRING_KAFKA_BOOTSTRAPSERVERS` | Address of the Kafka server(s) | `localhost:9092` |
| `SPRING_KAFKA_SECURITYPROTOCOL` | Security protocol for Kafka connections | `PLAINTEXT` |
| `SPRING_KAFKA_CONSUMERGROUPID` | Consumer group identifier | `notif123-preview` |
| `KAFKA_MESSAGEMAXBYTES` | Maximum message size (bytes) | `52428800` (50 MB) |
| `KAFKA_AUTHEXCEPTIONRETRYINTERVAL` | Retry interval after authorization exceptions (seconds) | `10` |
| `KAFKA_CONSUMER_THREADS` | Number of consumer threads | `1` |
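A sketch with a placeholder broker address and a per-environment consumer group ID:

```yaml
environment:
  SPRING_KAFKA_BOOTSTRAPSERVERS: kafka-0.kafka-headless:9092   # placeholder broker address
  SPRING_KAFKA_SECURITYPROTOCOL: PLAINTEXT
  SPRING_KAFKA_CONSUMERGROUPID: notification-plugin-prod       # placeholder; use one group ID per environment
  KAFKA_MESSAGEMAXBYTES: "52428800"                            # 50 MB
  KAFKA_CONSUMER_THREADS: "2"                                  # raise from the default 1 for higher throughput
```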
### Topic naming configuration

| Environment Variable | Description | Default Value |
|---|---|---|
| `KAFKA_TOPIC_NAMING_SEPARATOR` | Primary separator for topic names | `.` |
| `KAFKA_TOPIC_NAMING_SEPARATOR2` | Secondary separator for topic names | `-` |
| `KAFKA_TOPIC_NAMING_PACKAGE` | Package prefix for topic names | `ai.flowx.` |
| `KAFKA_TOPIC_NAMING_ENVIRONMENT` | Environment segment for topic names | `dev.` |
| `KAFKA_TOPIC_NAMING_VERSION` | Version suffix for topic names | `.v1` |
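For example, pointing the environment segment at `prod.` while keeping the other defaults should make the plugin derive topic names such as `ai.flowx.prod.plugin.notification.trigger.generate.otp.v1`, assuming the naming pattern described under Usage notes:

```yaml
environment:
  KAFKA_TOPIC_NAMING_PACKAGE: "ai.flowx."
  KAFKA_TOPIC_NAMING_ENVIRONMENT: "prod."   # swap dev. for prod.
  KAFKA_TOPIC_NAMING_VERSION: ".v1"
```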
### Topic configurations

Each action in the service corresponds to a Kafka event on a specific topic. Configure the following topics:

#### OTP topics

| Environment Variable | Description | Default Value |
|---|---|---|
| `KAFKA_TOPIC_OTP_GENERATE_IN` | Topic for incoming OTP generation requests | `ai.flowx.dev.plugin.notification.trigger.generate.otp.v1` |
| `KAFKA_TOPIC_OTP_GENERATE_OUT` | Topic for OTP generation results | `ai.flowx.dev.engine.receive.plugin.notification.generate.otp.results.v1` |
| `KAFKA_TOPIC_OTP_VALIDATE_IN` | Topic for incoming OTP validation requests | `ai.flowx.dev.plugin.notification.trigger.validate.otp.v1` |
| `KAFKA_TOPIC_OTP_VALIDATE_OUT` | Topic for OTP validation results | `ai.flowx.dev.engine.receive.plugin.notification.validate.otp.results.v1` |
#### Notification topics

| Environment Variable | Description | Default Value |
|---|---|---|
| `KAFKA_TOPIC_NOTIFICATION_INTERNAL_IN` | Topic for incoming notification requests | `ai.flowx.dev.plugin.notification.trigger.send.notification.v1` |
| `KAFKA_TOPIC_NOTIFICATION_INTERNAL_OUT` | Topic for notification delivery confirmations | `ai.flowx.dev.engine.receive.plugin.notification.confirm.send.notification.v1` |
| `KAFKA_TOPIC_NOTIFICATION_EXTERNAL_OUT` | Topic for forwarding notifications to external systems | `ai.flowx.dev.plugin.notification.trigger.forward.notification.v1` |
#### Audit topic

| Environment Variable | Description | Default Value |
|---|---|---|
| `KAFKA_TOPIC_AUDIT_OUT` | Topic for sending audit logs | `ai.flowx.dev.core.trigger.save.audit.v1` |
## File storage configuration

Depending on your use case, you can store files either on the local file system or in an S3-compatible cloud storage solution (for example, Min.io).

Configure file storage with the following environment variables (both variants are sketched after the table):

| Environment Variable | Description | Default Value |
|---|---|---|
| `APPLICATION_FILESTORAGE_TYPE` | Storage type to use (`s3` or `fileSystem`) | `s3` |
| `APPLICATION_FILESTORAGE_DISKDIRECTORY` | Directory for file storage when using the file system | `MS_SVC_NOTIFICATION` |
| `APPLICATION_FILESTORAGE_S3_ENABLED` | Enable S3-compatible storage | `true` |
| `APPLICATION_FILESTORAGE_S3_SERVERURL` | URL of the S3-compatible server | `http://minio-service:9000` |
| `APPLICATION_FILESTORAGE_S3_ENCRYPTIONENABLED` | Enable server-side encryption | `false` |
| `APPLICATION_FILESTORAGE_S3_ACCESSKEY` | Access key for S3 | `minio` |
| `APPLICATION_FILESTORAGE_S3_SECRETKEY` | Secret key for S3 | `secret` |
| `APPLICATION_FILESTORAGE_S3_BUCKETPREFIX` | Prefix for bucket names | `qdevlocal-preview-paperflow` |
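Two hedged variants, one per storage type; the URLs, keys, and paths are placeholders:

```yaml
# Variant A: S3-compatible storage (the default)
environment:
  APPLICATION_FILESTORAGE_TYPE: s3
  APPLICATION_FILESTORAGE_S3_ENABLED: "true"
  APPLICATION_FILESTORAGE_S3_SERVERURL: http://minio-service:9000
  APPLICATION_FILESTORAGE_S3_ACCESSKEY: ${S3_ACCESS_KEY}   # placeholder; inject from a secret store
  APPLICATION_FILESTORAGE_S3_SECRETKEY: ${S3_SECRET_KEY}   # placeholder; inject from a secret store

# Variant B: local file system
# environment:
#   APPLICATION_FILESTORAGE_TYPE: fileSystem
#   APPLICATION_FILESTORAGE_DISKDIRECTORY: /data/notifications   # placeholder path
```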
## SMTP setup

Configure SMTP settings for sending email notifications (an example follows the table):

| Environment Variable | Description | Default Value |
|---|---|---|
| `SIMPLEJAVAMAIL_SMTP_HOST` | SMTP server hostname | `smtp.gmail.com` |
| `SIMPLEJAVAMAIL_SMTP_PORT` | SMTP server port | `587` |
| `SIMPLEJAVAMAIL_SMTP_USERNAME` | SMTP server username | `notification.test@flowx.ai` |
| `SIMPLEJAVAMAIL_SMTP_PASSWORD` | SMTP server password | `password` |
| `SIMPLEJAVAMAIL_TRANSPORTSTRATEGY` | Email transport strategy (e.g., `SMTP`, `EXTERNAL_FORWARD`) | `SMTP` |
| `APPLICATION_MAIL_FROM_EMAIL` | Default sender email address | `notification.test@flowx.ai` |
| `APPLICATION_MAIL_FROM_NAME` | Default sender name | `Notification Test` |
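As an illustration, a relay-based configuration might look like this; the host, credentials, and sender identity are placeholders:

```yaml
environment:
  SIMPLEJAVAMAIL_SMTP_HOST: smtp.example.com       # placeholder relay host
  SIMPLEJAVAMAIL_SMTP_PORT: "587"
  SIMPLEJAVAMAIL_SMTP_USERNAME: no-reply@example.com
  SIMPLEJAVAMAIL_SMTP_PASSWORD: ${SMTP_PASSWORD}   # placeholder; inject from a secret store
  SIMPLEJAVAMAIL_TRANSPORTSTRATEGY: SMTP
  APPLICATION_MAIL_FROM_EMAIL: no-reply@example.com
  APPLICATION_MAIL_FROM_NAME: "Example Notifications"   # placeholder display name
```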
### Email attachments configuration

Configure the handling of email attachments:

| Environment Variable | Description | Default Value |
|---|---|---|
| `SPRING_HTTP_MULTIPART_MAXFILESIZE` | Maximum file size for attachments | `15MB` |
| `SPRING_HTTP_MULTIPART_MAXREQUESTSIZE` | Maximum request size for multipart uploads | `15MB` |
## OTP configuration

Configure one-time password (OTP) generation and validation (an example follows the table):

| Environment Variable | Description | Default Value |
|---|---|---|
| `FLOWX_OTP_LENGTH` | Number of characters in generated OTPs | `4` |
| `FLOWX_OTP_EXPIRETIMEINSECONDS` | Expiry time for OTPs (seconds) | `6000` (100 minutes) |
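For example, a common choice is a 6-character code valid for 5 minutes; these values are illustrative, not FlowX recommendations:

```yaml
environment:
  FLOWX_OTP_LENGTH: "6"                  # 6-character codes
  FLOWX_OTP_EXPIRETIMEINSECONDS: "300"   # 300 s = 5 minutes
```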
## Logging configuration

Control logging levels for the different components:

| Environment Variable | Description | Default Value |
|---|---|---|
| `LOGGING_LEVEL_ROOT` | Root logging level | - |
| `LOGGING_LEVEL_APP` | Application-specific log level | `DEBUG` |
| `LOGGING_LEVEL_MONGO_DRIVER` | MongoDB driver log level | `INFO` |
| `LOGGING_LEVEL_THYMELEAF` | Thymeleaf template engine log level | `INFO` |
| `LOGGING_LEVEL_FCM_CLIENT` | Firebase Cloud Messaging client log level | `OFF` |
| `LOGGING_LEVEL_REDIS` | Redis/Lettuce client log level | `OFF` |
## Usage notes

### Topic naming convention

Topics follow a standardized naming convention:

- Structure: `{package}.{environment}.{component}.{action}.{subject}.{version}`
- Example: `ai.flowx.dev.plugin.notification.trigger.generate.otp.v1`
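Reading the example against the structure, the segments break down roughly as follows; the exact grouping of component, action, and subject is our reading of the default topic names, not an official grammar:

```yaml
# ai.flowx.dev.plugin.notification.trigger.generate.otp.v1
package:     ai.flowx.            # KAFKA_TOPIC_NAMING_PACKAGE
environment: dev.                 # KAFKA_TOPIC_NAMING_ENVIRONMENT
component:   plugin.notification
action:      trigger
subject:     generate.otp
version:     .v1                  # KAFKA_TOPIC_NAMING_VERSION
```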
Consumer error handling
When KAFKA_CONSUMER_ERRORHANDLING_ENABLED
is set to true
:
- The application will retry processing failed messages according to
KAFKA_CONSUMER_ERRORHANDLING_RETRIES
- Between retries, the application will wait for the duration specified by
KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL
For example, if KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL
is set to 5000 (5 seconds) and KAFKA_CONSUMER_ERROR_HANDLING_RETRIES
is set to 5, the consumer application will make up to 5 attempts, waiting 5 seconds between each attempt.
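Putting that example into configuration form (the retry interval is in milliseconds, per the example above):

```yaml
environment:
  KAFKA_CONSUMER_ERRORHANDLING_ENABLED: "true"
  KAFKA_CONSUMER_ERRORHANDLING_RETRIES: "5"           # up to 5 attempts
  KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL: "5000"  # 5 seconds between attempts
```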
Message size configuration
The KAFKA_MESSAGE_MAX_BYTES
setting affects multiple Kafka properties:
spring.kafka.producer.properties.message.max.bytes
spring.kafka.producer.properties.max.request.size
spring.kafka.consumer.properties.max.partition.fetch.bytes
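In Spring Boot terms, raising the single environment variable is roughly equivalent to the application configuration sketch below; the bracketed keys are Spring's map-key syntax, and the exact wiring inside the plugin is an assumption, not documented here:

```yaml
spring:
  kafka:
    producer:
      properties:
        "[message.max.bytes]": 52428800
        "[max.request.size]": 52428800
    consumer:
      properties:
        "[max.partition.fetch.bytes]": 52428800
```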
### OAuth authentication

When the `kafka-auth` profile is active, the security protocol changes to `SASL_PLAINTEXT` and OAuth must be configured via the `KAFKA_OAUTH_*` variables.
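An illustrative sketch only; the exact variable names in the `KAFKA_OAUTH_*` family depend on your FlowX release, so treat every name and URL below as a placeholder:

```yaml
environment:
  SPRING_PROFILES_ACTIVE: kafka-auth
  SPRING_KAFKA_SECURITYPROTOCOL: SASL_PLAINTEXT
  # Hypothetical KAFKA_OAUTH_* names; check your release notes for the exact set:
  KAFKA_OAUTH_CLIENT_ID: kafka-client
  KAFKA_OAUTH_CLIENT_SECRET: ${KAFKA_OAUTH_CLIENT_SECRET}
  KAFKA_OAUTH_TOKEN_ENDPOINT_URI: https://idm.example.com/token
```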