Edge Labelling
When running classification models in OctaiPipe, predictions are returned as numbers by default. A model that classifies failure states such as “Normal”, “Failure”, and “Other” will have these classes converted to numbers such as 0, 1, and 2 during training, and will return those numbers at inference.
However, users often want model inference to return the original text labels and save those for further consumption.
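For intuition, the encoding amounts to a simple mapping from class names to numbers. The sketch below is purely illustrative; OctaiPipe performs this encoding internally during training, and the exact mapping it produces may differ:

# Purely illustrative sketch of how text classes become numbers.
# OctaiPipe handles this encoding internally during training.
class_to_number = {'Normal': 0, 'Failure': 1, 'Other': 2}

raw_targets = ['Normal', 'Failure', 'Other', 'Normal']
encoded = [class_to_number[label] for label in raw_targets]
print(encoded)  # [0, 1, 2, 0]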
In addition, after running unsupervised clustering algorithms such as k-FED Clustering and further analysing and evaluating the resulting clusters using Evaluating k-FED, you might want to assign labels to the clusters created.
OctaiPipe therefore provides functionality for assigning and working with model labels. This guide goes through how to assign labels to models and how to have those labels returned when running model inference.
Assigning labels for models
Working with labels for existing models makes use of the explainability module in the Python Interface.
To add labels to an existing model, we use the add_labels_for_model function. The function takes the model_id as an input, as well as a dictionary mapping the numerical labels (as strings) to the intended text labels. An example is shown below:
from octaipipe import explainability

model_id = 'YOUR_MODEL_ID_HERE'

labels = {
    '0': 'Normal',
    '1': 'Failure',
    '2': 'Other'
}

explainability.add_labels_for_model(model_id=model_id, labels=labels)
It is also possible to update the labels for a model in the same way by running update_labels_for_model:
from octaipipe import explainability

model_id = 'YOUR_MODEL_ID_HERE'

labels = {
    '2': 'Some_New_Label'
}

explainability.update_labels_for_model(model_id=model_id, labels=labels)
To view all labels assigned to a model, run get_labels_for_model:
from octaipipe import explainability

model_id = 'YOUR_MODEL_ID_HERE'

labels = explainability.get_labels_for_model(model_id=model_id)
print(labels)
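Given the examples above, the printed output would presumably be the full label mapping, e.g. {'0': 'Normal', '1': 'Failure', '2': 'Some_New_Label'}, assuming update_labels_for_model only overwrites the entries passed to it.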
If you wish to delete the labels set for a model, run delete_labels_for_model:
from octaipipe import explainability

model_id = 'YOUR_MODEL_ID_HERE'

explainability.delete_labels_for_model(model_id=model_id)
Getting labels during model inference
To get labels instead of numerical predictions during model inference, set the argument convert_predictions_to_labels to True in the run_specs of your inference config. See an example inference config file below:
run_specs:
  onnx_pred: false
  prediction_period: 30s
  convert_predictions_to_labels: true
This converts the model predictions to the labels set by the user for the model. If a prediction does not have a label assigned to it, the original prediction is returned as a string.
If the original predictions are [0, 1, 2, 0, 1, 2] and the labels set for the model are {'0': 'Normal', '1': 'Failure'}, the converted predictions would be ['Normal', 'Failure', '2', 'Normal', 'Failure', '2'].
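For intuition, the fallback behaviour works along the lines of the following minimal sketch (illustrative only, not OctaiPipe's actual implementation):

# Minimal sketch of the conversion described above; OctaiPipe performs
# this internally when convert_predictions_to_labels is set to True.
predictions = [0, 1, 2, 0, 1, 2]
labels = {'0': 'Normal', '1': 'Failure'}

# Look up each prediction's label; fall back to the prediction as a string
converted = [labels.get(str(pred), str(pred)) for pred in predictions]
print(converted)  # ['Normal', 'Failure', '2', 'Normal', 'Failure', '2']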