OctaiPipe Images#

To run OctaiPipe, you will need one or more of our Docker images. This page explains which images are available and when to use each of them.

OctaiPipe edge client image#

The edge client image is a lightweight client that is deployed on each device. The image name is generated automatically in your docker compose files during the Register a Device step.

The image name follows the pattern: octaipipe/octaipipe_manager:{tag}

  • {tag} is usually the OctaiPipe minor version, e.g. 2.2 or 2.3

If you want to run this with root access, edit the image in the docker compose file to follow the pattern octaipipe/octaipipe_manager-root:{tag}.
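As an illustration, the standard and root-access image names can be assembled in shell (the tag value below is hypothetical):

```shell
# Compose the edge client image name and its root-access variant.
# TAG is a hypothetical OctaiPipe minor version.
TAG="2.3"
IMAGE="octaipipe/octaipipe_manager:${TAG}"
ROOT_IMAGE="octaipipe/octaipipe_manager-root:${TAG}"
echo "${IMAGE}"       # standard edge client image
echo "${ROOT_IMAGE}"  # root-access variant
```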

Custom Jupyter image#

The custom Jupyter images are used to spin up a Jupyter instance in Kubeflow with OctaiPipe and related environment pre-installed.

The images follow the pattern: octaipipe.azurecr.io/custom_jupyter_img-{NN_pkg}:{tag}

  • {tag} is the OctaiPipe version, e.g. 2, 2.2, or 2.2.1

  • {NN_pkg} is the neural network package to use: either tf or torch. Using just custom_jupyter_img, with no dash or {NN_pkg} suffix, gives an image with both TF and torch
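For example, a full image reference can be assembled like this (the package and tag values are hypothetical):

```shell
# Build a custom Jupyter image reference; NN_PKG and TAG are example values.
NN_PKG="torch"
TAG="2.2.1"
echo "octaipipe.azurecr.io/custom_jupyter_img-${NN_PKG}:${TAG}"
# Omitting the -{NN_pkg} suffix selects the image with both TF and torch:
echo "octaipipe.azurecr.io/custom_jupyter_img:${TAG}"
```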

OctaiPipe library images#

The OctaiPipe library images are used when deploying OctaiPipe pipelines and training workloads to edge devices or Kubernetes.

The images follow the pattern: octaipipe.azurecr.io/{octaipipe_package}-{data_library}:{tag}

  • {octaipipe_package} is either octaipipe, octaipipe_lite, or octaipipe_core

  • {data_library} is either sql, mqtt, influxdb, or all_data_loaders

  • {tag} is the OctaiPipe version, e.g. 2, 2.2, or 2.2.1

If you want to run this with root access, use an image following the pattern: octaipipe.azurecr.io/{octaipipe_package}-{data_library}-root:{tag}.
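Putting the pieces together, the pattern can be sketched as a small shell helper (this is an illustration only, not an official tool; the argument values are examples):

```shell
# Sketch: assemble an OctaiPipe library image reference from its parts.
# Arguments: package, data library, tag, and an optional "root" flag.
octaipipe_image() {
  local pkg="$1" data_lib="$2" tag="$3" root="${4:-}"
  echo "octaipipe.azurecr.io/${pkg}-${data_lib}${root:+-root}:${tag}"
}

octaipipe_image octaipipe_lite influxdb 2.2    # → octaipipe.azurecr.io/octaipipe_lite-influxdb:2.2
octaipipe_image octaipipe mqtt 2.2.1 root      # → octaipipe.azurecr.io/octaipipe-mqtt-root:2.2.1
```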

FL server image#

The FL server image spins up in Kubernetes when an FL workload is run. It orchestrates federated model training and aggregates model updates.

The images follow the pattern: octaipipe.azurecr.io/fl_server-{model_type}:{tag}

  • {tag} is the OctaiPipe version, e.g. 2, 2.2, or 2.2.1.

  • {model_type} is the model type to use: either torch or xgboost. Using just fl_server, with no dash or {model_type} suffix, gives an image with both packages. Workloads running on Android with TensorFlow can use any fl_server image
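As a concrete illustration, the two forms of the FL server reference look like this (the model type and tag values are hypothetical):

```shell
# Example FL server image references; MODEL_TYPE and TAG are example values.
MODEL_TYPE="xgboost"
TAG="2.2"
echo "octaipipe.azurecr.io/fl_server-${MODEL_TYPE}:${TAG}"
# Omitting the -{model_type} suffix selects the image with both packages:
echo "octaipipe.azurecr.io/fl_server:${TAG}"
```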

OctaiOxide image#

OctaiOxide images run OctaiPipe in Rust via WebAssembly (WASM). This is a very lightweight version of OctaiPipe for low-powered devices. These images exist for OctaiPipe 2.2 and later.

The images follow the pattern: octaipipe.azurecr.io/octaioxide:{tag}

  • {tag} is the OctaiPipe version, e.g. 2, 2.2, or 2.2.1.