Available starting with FlowX.AI 5.5.0.

The AI Platform is a suite of microservices (Java and Python) that powers config-time AI agents, business agents, knowledge management, and conversational AI capabilities.

Overview

The AI Platform consists of three layers:
  • Java services — Core platform services handling data, orchestration, and storage (gRPC + GraphQL)
  • Python services — AI agent services for code generation, analysis, design, and knowledge processing (REST + gRPC)
  • Event-driven workers — Background services consuming Kafka topics for indexing, OCR, and replication
All inter-service communication uses gRPC with Protobuf contracts, except the config-time agents (AI Developer, AI Analyst, AI Designer) and Agent Builder which expose REST endpoints.

Infrastructure requirements

DGraph

Graph database for knowledge storage. Requires 3 Alpha + 3 Zero nodes for HA.

Qdrant

Vector database for embeddings. Cluster mode recommended for production.

S3-compatible storage

Object storage for binaries and files. Any S3-compatible provider works (MinIO, AWS S3, etc.).

Kafka

Message broker for event-driven communication. KRaft mode supported.

Keycloak

Identity provider for OAuth2 authentication across all services.

SpiceDB

Fine-grained authorization system for access control.

Service architecture

Java services

| Service | Default Port | Protocol | Purpose |
| --- | --- | --- | --- |
| Connected Graph | 9100 | GraphQL | API gateway and service orchestrator |
| Agents | 9101 | gRPC | Agent lifecycle management |
| Binaries | 9102 | gRPC | File and binary artifact storage |
| Conversations | 9103 | gRPC | Conversation management |
| Tenants | 9105 | gRPC | Multi-tenant management |
| Knowledge Graph (KAG) | 9106 | gRPC | Knowledge graph ingestion |
| MCP | 9108 | gRPC | Model Context Protocol integration |

Python services

| Service | Default Port | Protocol | Purpose |
| --- | --- | --- | --- |
| Planner | 9150 | gRPC | Intent understanding and task orchestration |
| AI Developer | 9151 | REST | Code generation (config-time agent) |
| AI Analyst | 9152 | REST | Process analysis (config-time agent) |
| AI Designer | 9153 | REST | UI generation (config-time agent) |
| Agent Builder | 9154 | REST | Business agent builder |
| Knowledgebase RAG | 9155 | gRPC | Retrieval-augmented generation |
| Embedder | 9156 | gRPC | Embedding generation |
| Knowledgebase | 9109 | gRPC | Knowledge base operations |
| Speech-to-Text | 9998 | REST | Audio transcription and text-to-speech (5.7+) |

Event-driven workers

These services have no exposed ports and consume from Kafka topics:
| Service | Trigger | Purpose |
| --- | --- | --- |
| Knowledgebase Indexer v2 | ai.flowx.ai-platform.internal.binaries.lifecycle | Document vector indexing |
| OCR | ai.flowx.ai-platform.internal.ocr.commands | Document text extraction |
| Tenants Replicator | ai.flowx.organization.events.v1 | Organization event replication |
In production Kubernetes deployments, all services default to port 9100 via the SERVICE_PORT variable. The ports listed above are the defaults for local development with Docker Compose.
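The split between per-service local ports and a single production port can be sketched as follows. This is an illustrative helper, not platform code; the service names and defaults come from the tables above (subset shown):

```python
import os

# Local-development (Docker Compose) defaults from the tables above.
LOCAL_PORTS = {
    "connected-graph": 9100,
    "agents": 9101,
    "binaries": 9102,
    "conversations": 9103,
    "planner": 9150,
    "embedder": 9156,
}

def service_port(service: str) -> int:
    """Return the port a service listens on.

    In production Kubernetes, SERVICE_PORT overrides everything (and defaults
    to 9100); locally each service keeps its own Docker Compose default.
    """
    env_port = os.environ.get("SERVICE_PORT")
    if env_port is not None:
        return int(env_port)
    return LOCAL_PORTS[service]
```

With `SERVICE_PORT` unset, `service_port("planner")` returns the local default 9150; with `SERVICE_PORT=9100` exported, every service resolves to 9100.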

Environment variables

These variables control how services locate each other within the cluster:
| Environment Variable | Description | Default Value |
| --- | --- | --- |
| GRPC_HOST_RESOLVER | Service discovery method | k8s |
| GRPC_HOST_RESOLVER_HELM_CHART | Helm chart name for K8s service resolution | |
| GRPC_HOST_RESOLVER_FIXED_IP | Fixed IP/hostname when using the host resolver | ai-platform |
| SERVICE_PORT | Port the service listens on | 9100 (production) |
Kubernetes deployment:
```
GRPC_HOST_RESOLVER=k8s
GRPC_HOST_RESOLVER_HELM_CHART=ai-platform
```
Docker Compose / local deployment:
```
GRPC_HOST_RESOLVER=host
GRPC_HOST_RESOLVER_FIXED_IP=localhost
```
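The effect of these variables can be sketched as a target-building function. This is an illustrative sketch, not the platform's resolver: the K8s DNS pattern (`<helm-chart>-<service>.<namespace>.svc.cluster.local`) is inferred from the nslookup example in Troubleshooting, and the fallbacks mirror the table above:

```python
import os

def grpc_target(service: str, port: int = 9100, namespace: str = "default") -> str:
    """Build a gRPC target for a peer service from the resolver variables.

    Illustrative only; the K8s naming pattern is an assumption based on the
    DNS example in the Troubleshooting section.
    """
    resolver = os.environ.get("GRPC_HOST_RESOLVER", "k8s")
    if resolver == "host":
        # Docker Compose / local: every peer lives at one fixed host.
        host = os.environ.get("GRPC_HOST_RESOLVER_FIXED_IP", "localhost")
    else:
        # Kubernetes: peers are addressed via chart-prefixed cluster DNS.
        chart = os.environ.get("GRPC_HOST_RESOLVER_HELM_CHART", "ai-platform")
        host = f"{chart}-{service}.{namespace}.svc.cluster.local"
    return f"{host}:{port}"
```

For example, with the Docker Compose settings above, `grpc_target("ai-conversations", 9103)` yields `localhost:9103`; with the Kubernetes settings it yields `ai-platform-ai-conversations.default.svc.cluster.local:9103`.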

Agent Builder configuration

| Environment Variable | Description | Default Value |
| --- | --- | --- |
| AGENT_BUILDER_MAX_TOOL_CALLS | Maximum number of tool calls an agent can make in a single workflow execution | 20 |
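A cap like this typically works as a per-execution budget. The sketch below is illustrative (it is not Agent Builder's actual implementation); only the variable name and its default of 20 come from the table above:

```python
import os

# Default of 20 matches the documented AGENT_BUILDER_MAX_TOOL_CALLS default.
MAX_TOOL_CALLS = int(os.environ.get("AGENT_BUILDER_MAX_TOOL_CALLS", "20"))

class ToolCallBudgetExceeded(RuntimeError):
    """Raised when a workflow execution tries to exceed its tool-call cap."""

class ToolCallBudget:
    """Illustrative guard: bounds tool calls within one workflow execution."""

    def __init__(self, limit: int = MAX_TOOL_CALLS):
        self.limit = limit
        self.calls = 0

    def spend(self) -> None:
        """Record one tool call, failing once the limit is reached."""
        if self.calls >= self.limit:
            raise ToolCallBudgetExceeded(
                f"workflow exceeded {self.limit} tool calls")
        self.calls += 1
```

Each workflow execution would create a fresh budget, so the cap applies per execution rather than per agent.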

Kafka topics

The AI Platform uses the following internal Kafka topics:
| Topic | Partitions | Purpose |
| --- | --- | --- |
| ai.flowx.ai-platform.internal.binaries.lifecycle | 10 | Binary upload events triggering indexing |
| ai.flowx.organization.events.v1 | 10 | Organization lifecycle events for tenant replication |
| ai.flowx.ai-platform.internal.ocr.commands | 10 | OCR processing commands |
| ai.flowx.ai-platform.internal.ocr.progress | 10 | OCR processing progress updates |
For production environments, create these topics manually with appropriate replication factors. For development, Kafka auto-topic creation handles them automatically.
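The manual provisioning step can be sketched as a small generator for the `kafka-topics` commands. Partition counts come from the table above; the replication factor of 3 is an assumption for a typical production cluster, so adjust it to your broker count:

```python
# Topic names and partition counts from the table above.
TOPICS = [
    "ai.flowx.ai-platform.internal.binaries.lifecycle",
    "ai.flowx.organization.events.v1",
    "ai.flowx.ai-platform.internal.ocr.commands",
    "ai.flowx.ai-platform.internal.ocr.progress",
]

def creation_commands(bootstrap: str = "localhost:9092",
                      replication: int = 3) -> list[str]:
    """Build kafka-topics commands for manually provisioning the topics.

    replication=3 is an assumed production value, not a documented default.
    """
    return [
        f"kafka-topics --bootstrap-server {bootstrap} --create --if-not-exists "
        f"--topic {topic} --partitions 10 --replication-factor {replication}"
        for topic in TOPICS
    ]

for cmd in creation_commands():
    print(cmd)
```

Run the printed commands against your cluster, or adapt them into your provisioning tooling.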

Deployment

The AI Platform ships as an umbrella Helm chart aggregating all microservices and infrastructure dependencies.

Install or upgrade:
```
helm dependency update deployment/helm/ai-platform
helm upgrade --install ai-platform deployment/helm/ai-platform \
  --set global.aiPlatformVersion=<version>
```
After deployment, initialize the platform:
```
make initialize-platform
make initialize-knowledge-graph
```

Key Helm values

Replica counts:
| Service | Default Replicas |
| --- | --- |
| Connected Graph | 1 |
| Knowledge Graph | 2 |
| Agents | 2 |
| Tenants | 2 |
| Planner | 2 |
| AI Developer | 2 |
| AI Analyst | 2 |
| AI Designer | 2 |
| Agent Builder | 2 |
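Replica counts can typically be overridden per service in the umbrella chart's values. The key names below (`planner.replicaCount`, etc.) are an assumption following common Helm conventions; verify them against the chart's values.yaml before use:

```yaml
# Hypothetical values override -- confirm key names in the chart's values.yaml
planner:
  replicaCount: 3
agent-builder:
  replicaCount: 3
```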
Global configuration:
```yaml
global:
  flowx:
    idp:
      provider: keycloak
      keycloak:
        hostname: <your-keycloak-host>
        realm: <your-realm>
  telemetry:
    prometheus: enabled
    otelCollector: enabled
```

Storage requirements

| Component | Default Persistent Volume | Notes |
| --- | --- | --- |
| DGraph Alpha | 30Gi per node | 3 nodes in HA mode |
| DGraph Zero | Ephemeral | Consensus nodes, no persistent data |
| Qdrant data | 30Gi | Vector embeddings |
| Qdrant snapshots | 30Gi | Backup snapshots |
| MinIO | Distributed across 4 nodes | Configured per deployment |
| Kafka | 1Gi minimum | Adjust based on message volume |

Troubleshooting

Kubernetes DNS resolution:
```
# Verify AI Platform services are running
kubectl get services -l app=ai-platform

# Check Helm deployment
helm list -n ai-platform

# Test DNS resolution
nslookup ai-platform-ai-conversations.default.svc.cluster.local
```
Common causes:
  • Incorrect GRPC_HOST_RESOLVER_HELM_CHART value
  • Services not in the same namespace
  • DNS not resolving due to CoreDNS issues
DGraph health check:
```
curl http://<dgraph-alpha-host>:8080/health
```
Qdrant health check:
```
curl http://<qdrant-host>:6333/healthz
```
Common causes:
  • Incorrect DGRAPH_CONNECTION_GRPC_ENDPOINT (must include all Alpha nodes for HA)
  • Missing QDRANT_CONNECTION_API_KEY
  • Qdrant cluster not fully initialized
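DGraph's `/health` endpoint returns a JSON array with one entry per instance, so an HA check should confirm that every Alpha reports healthy, not just the first. A minimal sketch (the sample payload is abbreviated; real responses carry additional fields):

```python
import json

def all_healthy(health_json: str) -> bool:
    """True only if every DGraph instance in a /health response is healthy."""
    return all(entry.get("status") == "healthy"
               for entry in json.loads(health_json))

# Abbreviated sample of a /health response from an Alpha node.
sample = '[{"instance": "alpha", "status": "healthy"}]'
print(all_healthy(sample))  # True
```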
Verify broker availability:
```
kafka-broker-api-versions --bootstrap-server <kafka-host>:9092
```
Verify topics exist:
```
kafka-topics --bootstrap-server <kafka-host>:9092 --list | grep ai-platform
```
Common causes:
  • Wrong KAFKA_BOOTSTRAP_SERVERS address
  • Topics not auto-created and not manually provisioned
  • Security mode mismatch (KAFKA_SECURITY_MODE)
Verify Keycloak connectivity:
```
curl https://<keycloak-host>/auth/realms/<realm>/.well-known/openid-configuration
```
Common causes:
  • Incorrect SECURITY_OAUTH2_BASE_SERVER_URL
  • Realm name mismatch
  • Client ID not registered in Keycloak
  • SpiceDB token expired or misconfigured
If AI nodes fail with model-related errors:
  • Verify that an AI provider is configured under Organization Settings → Model Providers with a successful connection test
  • Check that models are enabled in the provider’s whitelist
  • Verify that workspace-level model assignments are set for the relevant AI capability (text generation, image understanding, embeddings, document/OCR)
  • Ensure FLOWX_ORG_MANAGER_URL is set on all Python AI services and points to a reachable Organization Manager instance
See the AI providers and model configuration page for setup details.
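The last item in the checklist above — a missing or malformed FLOWX_ORG_MANAGER_URL — can be caught with a pre-flight check before starting a service. This is an illustrative sketch, not part of the platform:

```python
import os
from urllib.parse import urlparse

def check_org_manager_url() -> str:
    """Fail fast if FLOWX_ORG_MANAGER_URL is unset or not an http(s) URL."""
    url = os.environ.get("FLOWX_ORG_MANAGER_URL", "")
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(
            f"FLOWX_ORG_MANAGER_URL is missing or invalid: {url!r}")
    return url
```

Running this at startup on each Python AI service surfaces the misconfiguration immediately instead of as a model-related error at request time.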

Last modified on April 9, 2026