Customer management setup
The Customer management plugin is distributed as a Docker image, so it needs to be configured before it can be deployed.
Infrastructure prerequisites
Elasticsearch
Elastic Cloud on Kubernetes (ECK) can be used to provision an Elasticsearch instance.
Use the ECK quickstart to deploy the CRDs and the operator, then create the Elasticsearch (and, optionally, Kibana) instances:
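For example, following the ECK quickstart (the ECK version below is illustrative; use the one the quickstart currently recommends):

```sh
# Deploy the ECK CRDs and operator (version 2.9.0 is an example)
kubectl create -f https://download.elastic.co/downloads/eck/2.9.0/crds.yaml
kubectl apply -f https://download.elastic.co/downloads/eck/2.9.0/operator.yaml
```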
Elasticsearch instance:
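A minimal single-node instance, adapted from the ECK quickstart (the instance name, version, and node count are examples; size it for your environment):

```sh
cat <<EOF | kubectl apply -f -
apiVersion: elasticsearch.k8s.elastic.co/v1
kind: Elasticsearch
metadata:
  name: quickstart
spec:
  version: 8.5.0
  nodeSets:
  - name: default
    count: 1
    config:
      # disables mmap for environments where vm.max_map_count cannot be raised
      node.store.allow_mmap: false
EOF
```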
(Optional) Kibana instance:
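If Kibana is wanted, a matching quickstart-style instance pointing at the Elasticsearch deployment above:

```sh
cat <<EOF | kubectl apply -f -
apiVersion: kibana.k8s.elastic.co/v1
kind: Kibana
metadata:
  name: quickstart
spec:
  version: 8.5.0
  count: 1
  elasticsearchRef:
    name: quickstart   # must match the Elasticsearch instance name
EOF
```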
The index used by the Customer management plugin must also be created.
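The index can be created through the Elasticsearch REST API; as a sketch (the index name `customers` and the credentials are assumptions, use the values your deployment expects):

```sh
# Create the index used by the plugin; index name and credentials are assumptions
curl -k -u "elastic:$ELASTIC_PASSWORD" -X PUT "https://localhost:9200/customers"
```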
Postgres database
The plugin can run without this database, but in that case it will not store audit data.
Basic Postgres configuration
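As a minimal sketch, a throwaway Postgres instance for the audit data could be started like this (the image tag, database, user, and password are assumptions; align them with the SPRING_DATASOURCE_* variables described below):

```sh
docker run -d --name customer-management-db \
  -e POSTGRES_DB=customer_management \
  -e POSTGRES_USER=customer_management \
  -e POSTGRES_PASSWORD=change-me \
  -p 5432:5432 \
  postgres:15
```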
Configuration
Authorization configuration
The following environment variables need to be set in order to connect to the identity management platform (see the example after the list):
- SECURITY_OAUTH2_BASE_SERVER_URL - the base URL of the identity management server
- SECURITY_OAUTH2_CLIENT_CLIENT_ID - the client ID registered for the plugin
- SECURITY_OAUTH2_REALM - the realm to authenticate against
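A sketch of passing these when running the plugin's Docker image (the server URL, client ID, realm, and image name are all placeholders for your identity provider and registry):

```sh
docker run -d \
  -e SECURITY_OAUTH2_BASE_SERVER_URL=https://idp.example.com/auth \
  -e SECURITY_OAUTH2_CLIENT_CLIENT_ID=customer-management \
  -e SECURITY_OAUTH2_REALM=my-realm \
  <customer-management-image>   # placeholder image name
```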
Datasource configuration
To store audit data for searches, this plugin uses a Postgres database.
The following configuration details need to be added using environment variables:
- SPRING_DATASOURCE_URL - the JDBC connection URL for the Postgres database
- SPRING_DATASOURCE_USERNAME - the database user
- SPRING_DATASOURCE_PASSWORD - the database password
Make sure the user, password, connection URL, and database name are configured correctly; otherwise, the service will fail with errors at startup.
If you use a database to store the audit data, you can use the built-in script to maintain the database schema.
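For example (the JDBC URL, credentials, and image name are placeholders; the host and database must match your Postgres setup):

```sh
docker run -d \
  -e SPRING_DATASOURCE_URL=jdbc:postgresql://customer-management-db:5432/customer_management \
  -e SPRING_DATASOURCE_USERNAME=customer_management \
  -e SPRING_DATASOURCE_PASSWORD=change-me \
  <customer-management-image>   # placeholder image name
```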
Elasticsearch configuration
The connection to the Elasticsearch cluster is made over HTTPS using the Elasticsearch API. To connect to it, you will need to configure the connection details and the index used to store customers.
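The exact variable names are not listed here; purely as an illustration, Spring Boot style Elasticsearch settings often look like the following (every name and value below is an assumption, check the plugin's configuration reference for the real keys):

```sh
# Hypothetical variable names following Spring Boot conventions; verify against the plugin reference
docker run -d \
  -e SPRING_ELASTICSEARCH_REST_URIS=https://elasticsearch:9200 \
  -e SPRING_ELASTICSEARCH_REST_USERNAME=elastic \
  -e SPRING_ELASTICSEARCH_REST_PASSWORD=change-me \
  <customer-management-image>   # placeholder image name
```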
Kafka configuration
The following Kafka-related settings can be configured using environment variables:
- SPRING_KAFKA_BOOTSTRAP_SERVERS - the address of the Kafka server
- SPRING_KAFKA_CONSUMER_GROUP_ID - the group of consumers
- KAFKA_CONSUMER_THREADS - the number of Kafka consumer threads
- KAFKA_AUTH_EXCEPTION_RETRY_INTERVAL - the interval between retries after AuthorizationException is thrown by KafkaConsumer
- KAFKA_MESSAGE_MAX_BYTES - the largest size of a message that can be received by the broker from a producer
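For example (all values are illustrative; KAFKA_MESSAGE_MAX_BYTES here is roughly 50 MB):

```sh
docker run -d \
  -e SPRING_KAFKA_BOOTSTRAP_SERVERS=kafka:9092 \
  -e SPRING_KAFKA_CONSUMER_GROUP_ID=customer-management \
  -e KAFKA_CONSUMER_THREADS=1 \
  -e KAFKA_AUTH_EXCEPTION_RETRY_INTERVAL=10 \
  -e KAFKA_MESSAGE_MAX_BYTES=52428800 \
  <customer-management-image>   # placeholder image name
```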
Each action available in the service corresponds to a Kafka event. A separate Kafka topic must be configured for each use-case.
The Engine listens for messages on topics whose names follow a specific pattern, so make sure to use the correct outgoing topic names when configuring the Customer management plugin.
Needed topics:
- KAFKA_TOPIC_CUSTOMER_SEARCH_IN - the topic on which the plugin receives customer search requests
- KAFKA_TOPIC_CUSTOMER_SEARCH_OUT - the topic on which the plugin publishes search results
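For example (the topic names are placeholders; they must follow the naming pattern the Engine listens on):

```sh
docker run -d \
  -e KAFKA_TOPIC_CUSTOMER_SEARCH_IN=customer-search-in \
  -e KAFKA_TOPIC_CUSTOMER_SEARCH_OUT=customer-search-out \
  <customer-management-image>   # placeholder topic and image names
```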
In order to match a request made to the Customer management plugin, the Engine has to send the process ID in a Kafka header.
Logging
The following environment variables can be set in order to control log levels:
- LOGGING_LEVEL_ROOT - the log level for root Spring Boot microservice logs
- LOGGING_LEVEL_APP - the log level for application-level logs
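For example (the levels are illustrative; any standard level such as TRACE, DEBUG, INFO, WARN, or ERROR can be used):

```sh
docker run -d \
  -e LOGGING_LEVEL_ROOT=INFO \
  -e LOGGING_LEVEL_APP=DEBUG \
  <customer-management-image>   # placeholder image name
```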