Available starting with FlowX.AI 5.7.0.

ModPod is a lightweight ML model serving microservice. It exposes a REST API for uploading, managing, and running inference on LightGBM and scikit-learn models, with built-in SHAP explainability.
ModPod is a standalone Python service. It does not depend on Kafka, Keycloak, or object storage — model artifacts and the model registry live on a local persistent volume.

Dependencies

ModPod is self-contained. It does not require any other FlowX infrastructure service to run.
  • Persistent volume — required for model artifacts and the SQLite registry
  • Identity provider — not used; authentication is a static bearer token
  • Kafka — not used in 5.7.0
  • Object storage (S3/MinIO) — not used; models are stored on the mounted volume

Capabilities

  • Upload model packages as ZIP archives with a metadata.json manifest
  • Manage model registry (list, fetch, delete by name/version)
  • Load models into memory on demand
  • Run inference with SHAP-based top-factor explanations
  • Support for LightGBM (Booster text format) and Logistic Regression (scikit-learn, joblib-serialized)

Authentication

ModPod uses a static bearer token. Set the AUTHORIZATION_APIKEY environment variable and include the token on every request except the health check:
Authorization: Bearer <AUTHORIZATION_APIKEY>
The API key is a static secret. Store it in a Kubernetes Secret (or equivalent) and never commit it to source control.
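As a minimal sketch, the header can be attached from Python's standard library — the host below is a placeholder, and the request is only constructed, not sent:

```python
import os
import urllib.request

# Read the shared secret from the environment; the fallback is for
# illustration only and should never be used in a real deployment.
api_key = os.environ.get("AUTHORIZATION_APIKEY", "change-me")

# Construct (but do not send) an authenticated request to the registry.
req = urllib.request.Request(
    "http://modpod.example.internal/modpod/models",  # placeholder host
    headers={"Authorization": f"Bearer {api_key}"},
)
print(req.get_header("Authorization"))
```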

API endpoints

All endpoints are prefixed with /modpod.

Health

| Method | Path | Auth | Description |
| --- | --- | --- | --- |
| GET | /modpod/info/healthcheck | None | Service health check |

Model upload

| Method | Path | Auth | Description |
| --- | --- | --- | --- |
| POST | /modpod/upload | Bearer | Upload a ZIP archive containing model files and metadata.json |
The ZIP must contain a metadata.json file with name and version fields. Files are extracted to <MODPOD_MODELS_DIR>/<name>/<version>/.
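Since a malformed archive is rejected, a client-side pre-flight check can save a round trip. The sketch below is an assumption (not part of the service API): it verifies the manifest locally before upload.

```python
import io
import json
import zipfile

def validate_model_package(zip_bytes: bytes) -> dict:
    """Check that a model ZIP carries a metadata.json with the
    name and version fields ModPod requires on upload."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        if "metadata.json" not in zf.namelist():
            raise ValueError("metadata.json missing from package")
        meta = json.loads(zf.read("metadata.json"))
    for field in ("name", "version"):
        if field not in meta:
            raise ValueError(f"metadata.json missing required field: {field}")
    return meta

# Build a minimal in-memory package to exercise the check.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("metadata.json",
                json.dumps({"name": "call-propensity", "version": "1.0"}))
meta = validate_model_package(buf.getvalue())
print(meta["name"], meta["version"])
```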

Model registry

| Method | Path | Auth | Description |
| --- | --- | --- | --- |
| GET | /modpod/models | Bearer | List all registered models |
| GET | /modpod/models/{name} | Bearer | Get model details and versions |
| GET | /modpod/models/{name}/versions | Bearer | List versions for a given model |
| DELETE | /modpod/models/{name}/{version} | Bearer | Delete a specific model version |
| DELETE | /modpod/models | Bearer | Delete all models |

Inference

| Method | Path | Auth | Description |
| --- | --- | --- | --- |
| POST | /modpod/inference/load | Bearer | Load a model into memory |
| POST | /modpod/inference/predict | Bearer | Run prediction on a loaded model |

Predict request

```json
{
  "model_name": "call-propensity",
  "model_version": "1.0",
  "model_type": "propensity",
  "input_data": {
    "feature1": 0.5,
    "feature2": "category_a",
    "feature3": 100
  }
}
```

Predict response

```json
{
  "output_data": {
    "model": "call-propensity:1.0",
    "input": { "...": "..." },
    "result": {
      "prediction": 0.73,
      "top_factors": "feature1=0.50 (avg=0.30); feature2=category_a"
    }
  }
}
```
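A caller typically only needs prediction and top_factors from the nested response. A minimal parsing sketch, using the sample body above:

```python
import json

# Sample predict response body, as shown in the docs.
body = """
{
  "output_data": {
    "model": "call-propensity:1.0",
    "input": {"feature1": 0.5},
    "result": {
      "prediction": 0.73,
      "top_factors": "feature1=0.50 (avg=0.30); feature2=category_a"
    }
  }
}
"""

# Drill into output_data.result and split the factor string into parts.
result = json.loads(body)["output_data"]["result"]
prediction = result["prediction"]
factors = result["top_factors"].split("; ")
print(prediction, factors)
```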

Supported model types

| Type | Format | File name |
| --- | --- | --- |
| LightGBM | Booster text format | encoders/model-{type}.txt |
| Logistic Regression | joblib-serialized scikit-learn model | encoders/model-{type}.joblib |
Both types produce SHAP-based top-factor explanations alongside the prediction. LightGBM uses the built-in pred_contrib output; Logistic Regression uses shap.LinearExplainer.
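To illustrate the shape of the top_factors string (a simplified reconstruction, not the service's actual implementation): rank features by absolute SHAP contribution, render numeric features with their dataset average, and categorical features with their raw value.

```python
def format_top_factors(contributions, feature_values, feature_means, top_n=2):
    """Illustrative only: rank features by |SHAP contribution| and render
    a top-factors string in the style shown in the predict response."""
    ranked = sorted(contributions, key=lambda f: abs(contributions[f]), reverse=True)
    parts = []
    for name in ranked[:top_n]:
        value = feature_values[name]
        if isinstance(value, (int, float)):
            # Numeric feature: show value alongside its dataset average.
            parts.append(f"{name}={value:.2f} (avg={feature_means[name]:.2f})")
        else:
            # Categorical feature: no average to show.
            parts.append(f"{name}={value}")
    return "; ".join(parts)

contribs = {"feature1": 0.21, "feature2": -0.12, "feature3": 0.03}
values = {"feature1": 0.5, "feature2": "category_a", "feature3": 100}
means = {"feature1": 0.30, "feature3": 85.0}
print(format_top_factors(contribs, values, means))
# feature1=0.50 (avg=0.30); feature2=category_a
```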

Model package structure

Model packages are uploaded as ZIP archives with the following layout:
```
model.zip/
├── metadata.json
└── encoders/
    ├── model-{type}.txt            # LightGBM model (or .joblib for Logistic Regression)
    ├── {type}-feature-context.csv  # Feature metadata (mean, direction, type)
    ├── standard_scaler.pkl         # StandardScaler (Logistic Regression only)
    ├── shap_background.pkl         # SHAP background data (Logistic Regression only)
    └── {feature}-classes.pkl       # LabelEncoder per categorical feature (LightGBM)
```
The metadata.json file must contain at least name and version:
```json
{
  "name": "call-propensity",
  "version": "1.0",
  "description": "Propensity-to-call scoring model"
}
```
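The layout above can be produced with a small packaging helper. The sketch below uses a stand-in model file; a real package would include the trained model plus its feature-context and encoder artifacts.

```python
import json
import tempfile
import zipfile
from pathlib import Path

def build_model_package(out_path, name, version, encoder_files):
    """Zip a metadata.json manifest plus encoder artifacts into the
    layout ModPod expects.

    encoder_files maps archive names (placed under encoders/) to
    local file paths.
    """
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("metadata.json", json.dumps({"name": name, "version": version}))
        for arcname, local_path in encoder_files.items():
            zf.write(local_path, f"encoders/{arcname}")

# Demonstrate with a placeholder model file.
tmp = Path(tempfile.mkdtemp())
(tmp / "model-propensity.txt").write_text("placeholder LightGBM booster")
package = tmp / "model.zip"
build_model_package(package, "call-propensity", "1.0",
                    {"model-propensity.txt": tmp / "model-propensity.txt"})
names = zipfile.ZipFile(package).namelist()
print(names)
```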

Environment variables

Authentication

| Environment Variable | Description | Default Value |
| --- | --- | --- |
| AUTHORIZATION_APIKEY | Bearer token required on all authenticated endpoints. | - (required) |

Storage

| Environment Variable | Description | Default Value |
| --- | --- | --- |
| MODPOD_MODELS_DIR | Directory where uploaded model artifacts are stored. Mount a persistent volume here. | ./models |
| MODPOD_DATABASE_URL | Database URL for the model registry. SQLite is used by default. | sqlite:///modpod.db |
| MODPOD_DB_PATH | SQLite file path. Used only when MODPOD_DATABASE_URL is not set. | ./modpod.db |

Server

| Environment Variable | Description | Default Value |
| --- | --- | --- |
| GUNICORN_WORKERS | Number of Gunicorn worker processes. | 1 |
| GUNICORN_TIMEOUT | Request timeout in seconds. | 600 |
Recommended worker count is 1. Inference is CPU-bound and loaded models are held in per-process memory; additional workers multiply the memory footprint without increasing throughput for a single model.

CORS

| Environment Variable | Description | Default Value |
| --- | --- | --- |
| APPLICATION_CORS_ALLOW_ORIGIN | Comma-separated list of allowed origins. Set to the frontend origin(s) that call ModPod. | - |
| APPLICATION_CORS_ALLOW_HEADERS | Comma-separated list of allowed request headers. | Standard FlowX headers |
| APPLICATION_CORS_ALLOW_METHODS | Comma-separated list of allowed HTTP methods. | GET,PUT,POST,DELETE,PATCH,OPTIONS |

Deployment

ModPod is packaged as a Docker image based on python:3.13.3-slim-bookworm and runs as a non-root user.
| Setting | Value |
| --- | --- |
| Container port | 8080 |
| Health check path | /modpod/info/healthcheck |
| Models directory (in container) | /code/models |
| SQLite database (in container) | /code/modpod.db |

Resource recommendations

| Resource | Recommendation |
| --- | --- |
| Memory | 512 MB – 1 GB, depending on the number and size of loaded models |
| CPU | 1 vCPU (inference is CPU-bound) |
| Gunicorn workers | 1 |
| Persistent volume | Minimum 5Gi for /code/models (grows with the number of models) |
The SQLite registry is file-based and stored on the same volume as model artifacts. To scale ModPod horizontally, switch to an external database by setting MODPOD_DATABASE_URL to a shared database URL.
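The default sqlite:/// URL follows the SQLAlchemy convention. When pointing the registry at a shared database, the value might look like the following — the scheme, credentials, and host are placeholders, so confirm which database backends your ModPod build supports:

```shell
# Placeholder connection details — not a documented default.
export MODPOD_DATABASE_URL="postgresql://modpod:CHANGE_ME@db.internal:5432/modpod"
```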

Validation

Health check returns 200 OK:

```shell
curl http://<modpod-host>/modpod/info/healthcheck
```

Authenticated listing returns an empty list (fresh install) or your registered models:

```shell
curl -H "Authorization: Bearer $AUTHORIZATION_APIKEY" \
  http://<modpod-host>/modpod/models
```

Last modified on April 24, 2026