This repo contains a Kubernetes Operator based on the kopf and kubernetes Python packages that is used by the Informatics Matters Squonk2 Data Manager API to create transient Jobs (Kubernetes Pods) for the Data Manager service.
Prerequisites: -
- Python
- Docker
- A kubernetes config file
- A compatible Kubernetes (e.g. 1.32 through 1.34 if the operator is built for 1.33) - you can check the server version as shown below
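If you are unsure which release your cluster is running, you can query the server version with kubectl (this assumes your KUBECONFIG already points at the cluster): -
kubectl version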
The project uses: -
- pre-commit to enforce linting of files prior to committing them to the upstream repository
- Commitizen to enforce a Conventional Commit commit message format
- Black as a code formatter
You MUST comply with these choices in order to contribute to the project.
To get started, review the pre-commit utility and the Conventional Commit style, and then set up your local clone by following the Installation and Quick Start sections: -
pip install -r build-requirements.txt
pre-commit install -t commit-msg -t pre-commit
Now the project's rules will run on every commit, and you can check the current health of your clone with: -
pre-commit run --all-files
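As an illustration (the message below is hypothetical), a Conventional Commit message starts with a type such as feat, fix or docs: -
git commit -m "docs: clarify the operator versioning scheme"
Commitizen can also build the message for you interactively: -
cz commit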
Prerequisites: -
- Docker Compose (v2)
The operator container, residing in the operator directory, is automatically built and pushed to Docker Hub using GitHub Actions.
You can build the image yourself using docker compose. The following will build and push an operator image with a specific tag: -
export IMAGE_TAG=34.0.0-alpha.1
docker compose build
docker compose push
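If you want to confirm that the tag has been picked up before pushing, one way is to render the resolved compose configuration and look at the image reference (the grep filter here is just illustrative): -
docker compose config | grep image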
We adopt a different approach to operator versioning. At the time of writing we were on version 33, and major changes do not result in changes to this number. Why? The major revision is used to identify the Kubernetes 1.x release the operator is built against, so the 33.x.x operator is built using the 33.x kubernetes Python package (see the kubernetes package version in operator/requirements.txt). When we make feature changes we update the minor value, and for bug-fixes we adjust the patch value. So, for a build against Kubernetes 1.33, our major version will always be 33.
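For example, to see which kubernetes client version the operator is currently pinned to, you can inspect the requirements file (the exact pin you see will depend on the release you have checked out): -
grep kubernetes operator/requirements.txt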
We use Ansible to deploy the operator. This is done from a suitable Python environment created using the requirements in the root of the project...
python -m venv venv
source venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
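With the virtual environment active you can quickly confirm that the Ansible tooling has been installed: -
ansible-playbook --version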
Set your KUBECONFIG for the cluster and verify it's as expected by listing the nodes: -
export KUBECONFIG=~/k8s-config/config-local
kubectl get no
[...]
Now, create a parameter file (i.e. parameters.yaml) based on the project's parameters-template.yaml, setting values for the operator that match your needs. Then deploy, using Ansible, from the root of the project: -
PARAMS=parameters
ansible-playbook -e @${PARAMS}.yaml site.yaml
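Once the playbook completes you can check that the operator Pod is running. The Pod and namespace names depend on your parameter values, so the filter below is only an illustration: -
kubectl get pods --all-namespaces | grep -i operator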
To remove the operator (assuming there are no operator-derived instances)...
ansible-playbook -e @${PARAMS}.yaml -e jo_state=absent site.yaml
The current Data Manager API assumes that once an Application (operator) has been installed it is not removed. So, removing the operator here is described simply to illustrate a 'clean-up' - you would not normally remove an Application operator in a production environment.