Katana ML Skipper
This is a simple and flexible ML workflow engine. It orchestrates events across a set of microservices and creates an executable flow to handle requests. The engine is designed to be configurable with any microservices. Enjoy!
The Engine and Communication parts are generic and can be reused. A group of ML services is provided for sample purposes; you should replace this group with your own services. The current group of ML services works with Boston Housing data: the Data service fetches Boston Housing data and converts it to a format suitable for TensorFlow model training, the Training service builds the TensorFlow model, and the Serving service, scaled to two instances, serves prediction requests.
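The engine correlates requests and replies as they pass between microservices over RabbitMQ. A minimal sketch of what such an event envelope could look like is below; the field names ("task_type", "data", "request_id") are illustrative assumptions, not Skipper's actual message schema.

```python
import json
import uuid

def build_event(task_type: str, payload: dict) -> str:
    """Wrap a request for one microservice (data, training, or serving).

    The envelope carries a correlation id so the engine can match the
    eventual reply back to the originating request.
    """
    envelope = {
        "request_id": str(uuid.uuid4()),  # hypothetical correlation id
        "task_type": task_type,           # e.g. "training" or "serving"
        "data": payload,
    }
    return json.dumps(envelope)

# Example: ask the serving service for a Boston Housing price prediction.
event = build_event("serving", {"features": [0.02731, 0.0, 7.07]})
decoded = json.loads(event)
```

A real deployment would publish this JSON string to the appropriate RabbitMQ queue and consume the reply from a response queue.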
docker-compose up --build -d
This will start Skipper services and RabbitMQ.
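Once the stack is up, you can sanity-check it from the command line. The RabbitMQ management port below is the image's default; adjust if your compose file maps it differently:

```shell
# List the containers and their state:
docker-compose ps

# RabbitMQ's management UI is typically exposed on port 15672
# (default credentials are guest/guest unless overridden):
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:15672
```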
Web API FastAPI endpoint:
NGINX Ingress Controller:
If you are using a local Kubernetes setup, install the NGINX Ingress Controller.
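The controller can be installed from the upstream manifest. The version tag in the URL is an example; pick a release that matches your Kubernetes version:

```shell
# Install the NGINX Ingress Controller (example release tag shown):
kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/controller-v1.8.2/deploy/static/provider/cloud/deploy.yaml
```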
Build Docker images:
docker-compose -f docker-compose-kubernetes.yml build
Setup Kubernetes services:
Skipper API endpoint published through NGINX Ingress (you can set up your own host in /etc/hosts):
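To reach the endpoint locally, map the ingress host to your loopback address and probe it. The hostname `skipper.local` below is a placeholder; use whatever host your Ingress resource is configured with:

```shell
# Map the ingress host locally (hostname is an example):
echo "127.0.0.1 skipper.local" | sudo tee -a /etc/hosts

# Verify the endpoint is reachable through the ingress controller:
curl -i http://skipper.local/
```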
Check NGINX Ingress Controller pod name:
kubectl get pods -n ingress-nginx
Sample response; copy the name of the 'Running' pod:
NAME                                       READY   STATUS      RESTARTS   AGE
ingress-nginx-admission-create-dhtcm       0/1     Completed   0          14m
ingress-nginx-admission-patch-x8zvw        0/1     Completed   0          14m
ingress-nginx-controller-fd7bb8d66-tnb9t   1/1     Running     0          14m
NGINX Ingress Controller logs:
kubectl logs -n ingress-nginx -f ingress-nginx-controller-fd7bb8d66-tnb9t