Android Inference Step#
From OctaiPipe version 2.3 onwards we have introduced model management and image based Model Inference support for Android devices:
- Lightweight and high-performance Rust based native Android library.
- Simple setup to integrate OctaiPipe into your Android application.
- Edge based inference allows predictions to be made even when offline.
- Secure and private, without the need for data to leave your device.
- Currently supports image based models, with support for other data types coming soon.
The best way to understand how to deploy Android inference models to your devices is to run the notebook tutorial in our Jupyter image: Tutorials/06_Android-deployment/android-deployment.
Integrating OctaiPipe into your Android App#
Our library is named OctaioxideLib and is contained within liboctaioxide.so.
The liboctaioxide.so library file should be placed in main/jniLibs/arm64-v8a/ within your Android project.
Create a new class called OctaioxideLib in your Java package: com.example.yourapp/OctaioxideLib. This will load our library and expose our native functions.
The OctaioxideLib class needs to be defined as follows, for example in Kotlin:
OctaioxideLib.kt
package com.example.yourapp

object OctaioxideLib {
    init {
        System.loadLibrary("octaioxide")
    }

    external fun initialiseDevice(portalEndpoint: String, deviceId: String, oneTimeKey: String): String
    external fun runEdgeClient(): String
    external fun stopEdgeClient(): String
    external fun imageInference(imagePath: String): String
}
You’ll need to set an environment variable OCTAIPIPE_VOLUME_PATH in your app; it should be set to a path where your app has write access.
All OctaiPipe related files, including credentials, configurations and models, will be stored here.
In this example we set the path to the device’s Documents directory: /storage/emulated/0/Android/data/<app>/files/Documents
MainActivity.kt
val octaipipeVolumePath = getExternalFilesDir(Environment.DIRECTORY_DOCUMENTS).toString()
setenv(
    "OCTAIPIPE_VOLUME_PATH",
    octaipipeVolumePath,
    true
)
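For context, the sketch below shows where this setup could live in an Activity. It is a minimal example assuming android.system.Os.setenv (API 21+) is used to set the variable in onCreate, before any OctaioxideLib functions are called; adapt the package and class names to your app.

package com.example.yourapp

import android.os.Bundle
import android.os.Environment
import android.system.Os
import androidx.appcompat.app.AppCompatActivity

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Point OctaiPipe at a directory the app can write to, before any native calls are made
        val octaipipeVolumePath = getExternalFilesDir(Environment.DIRECTORY_DOCUMENTS).toString()
        Os.setenv("OCTAIPIPE_VOLUME_PATH", octaipipeVolumePath, true)
    }
}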
External functions#
In order to run inference on your Android device, you will need to call external functions defined in your OctaioxideLib class; these are described in the Functions section below.
Responses#
When calling external functions, the responses follow a specific JSON structure, which includes the following fields:
- result: Indicates the outcome of the operation. A value of “0” represents success and “1” represents an error.
- content: Contains detailed information about the result.
  - message: Provides a human-readable explanation of the result.
  - code (in error responses only): Specifies the error code.
Example success response:
{
  "result": "0",
  "content": {
    "message": "Device initialised and authenticated successfully"
  }
}
Example error response:
{
  "result": "1",
  "content": {
    "code": "4",
    "message": "One time key already used"
  }
}
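Since every call returns this same JSON envelope, it can be convenient to parse it once in a small helper. The sketch below uses Android's built-in org.json; the OctaiResponse class and parseOctaiResponse function are illustrative names, not part of the library.

import org.json.JSONObject

// Illustrative wrapper around the common response envelope
data class OctaiResponse(
    val success: Boolean,     // true when "result" == "0"
    val message: String,      // human-readable explanation from "content.message"
    val errorCode: String?    // "content.code", present only in error responses
)

fun parseOctaiResponse(raw: String): OctaiResponse {
    val json = JSONObject(raw)
    val content = json.getJSONObject("content")
    return OctaiResponse(
        success = json.getString("result") == "0",
        message = content.optString("message"),
        errorCode = if (content.has("code")) content.getString("code") else null
    )
}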
Error Codes#
The following is a list of possible error codes and their descriptions. The specific error message returned will give more fine-grained information about the error.
| Error Code | Name | Description |
|---|---|---|
| 1 | InitialiseDevice | A generic error occurred during device initialisation. Check the specific error message for more details. |
| 2 | PortalEndpoint | An issue with the provided portal endpoint. This could be due to an incorrect URL or connectivity issues with the Portal. |
| 3 | DeviceId | A problem with the device ID. The device ID might be incorrect or not registered in the Portal. |
| 4 | OneTimeKey | An issue with the one-time key used for device initialisation. This key might have already been used or is invalid. |
| 5 | ImageInference | An error occurred during image inference. This could be due to an invalid image path, unsupported image format, or issues with the inference model. Check the specific error message for more details. |
| 6 | Authentication | Indicates an authentication failure. This may occur if the device credentials are incorrect or expired. |
| 7 | Timeout | A timeout occurred during a network operation or inference process. This could be due to network instability or a long-running process exceeding the allowed time. |
| 8 | EdgeClient | Indicates an error related to the Edge Client. This could be due to a failure to start or stop the Edge Client process. |
| 9 | FileAccessPermissionDenied | The library does not have the necessary permissions to access or modify a required file or directory. This can occur if OCTAIPIPE_VOLUME_PATH is not set correctly or the app does not have the necessary permissions to access the specified path. |
The error codes should help you diagnose issues during operation of the library so that they can be handled appropriately in your app.
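As an illustration, the sketch below maps a few of these codes to user-facing messages using the parsing helper sketched in the Responses section; the wording and the subset of codes handled are only examples.

// Sketch only: turn an error response into a message your app can surface
fun describeOctaiError(response: OctaiResponse): String =
    when (response.errorCode) {
        "4" -> "One-time key already used or invalid - generate a new key in the Portal."
        "7" -> "The operation timed out - check the device's network connection and retry."
        "9" -> "OctaiPipe cannot access OCTAIPIPE_VOLUME_PATH - check the path and app permissions."
        else -> "OctaiPipe error ${response.errorCode}: ${response.message}"
    }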
Functions#
OctaioxideLib.initialiseDevice(portalEndpoint: String, deviceId: String, oneTimeKey: String): String
This function is used to initialise the device with credentials you can obtain by registering the device in your OctaiPipe Portal. Before these three values are set, the device will be unable to authenticate with the Portal.
- Arguments:
  - portalEndpoint: The URL of your OctaiPipe Portal; this is unique to your organisation.
  - deviceId: A unique identifier for your device which you can choose; it must be the same as the device ID used when registering this device in the Portal.
  - oneTimeKey: The one-time key obtained when registering your device in the Portal; this can only be used once for device initialisation. If you need to reinitialise the device you will need to generate a new one-time key.
- Returns:
  - A JSON formatted string indicating success or failure.
Example success response:
{ "result": "0", "content": { "message": "Device initialised and authenticated successfully" } }
Example error response:
{ "result": "1", "content": { "code": "4", "message": "One time key already used" } }
OctaioxideLib.runEdgeClient(): String
This is an asynchronous function that runs the OctaiPipe Edge Client in the background. It can be called as a one-off operation when your app is launched; it does not need to be launched as a background process. The Edge Client:
- Communicates with the Portal, pinging it periodically to confirm that the device is online.
- Checks for any new model deployments, automatically downloading, configuring and setting the new model as active.
- Returns:
  - A JSON formatted string indicating success or failure to start the Edge Client.
Example error response:
{ "result": "1", "content": { "code": "8", "message": "Edge client failed to start" } }
Example success response:
{ "result": "0", "content": { "message": "Edge client started successfully" } }
OctaioxideLib.stopEdgeClient(): String
The Edge Client process can be stopped by calling this function. For normal operations this is not required as it runs efficiently in the background as long as the app is running.
Note: If the Edge Client is stopped you will not be able to pull new deployments to the handset; however, inference can still be run using the local active model.
If you need to start the Edge Client again, you will need to call runEdgeClient().
- Returns:
  - A JSON formatted string indicating success or failure to stop the Edge Client.
Example error response:
{ "result": "1", "content": { "code": "8", "message": "Edge client failed to stop" } }
Example success response:
{ "result": "0", "content": { "message": "Edge client stopped successfully" } }
OctaioxideLib.imageInference(imagePath: String): String
Runs image inference using the active model.
- Arguments:
  - imagePath: The path to the image; your app must have read access to this file.
- Returns:
  - A JSON formatted string of prediction results. The JSON format will depend on the model and post-processing steps you are using. An example format using an Xception model is shown in the success response below.
Example error response:
{ "result": "1", "content": { "code": "5", "message": "Image Inference doesn't appear to have been deployed" } }
Example success response:
{ "result": "0", "content": { "prediction": { "artificial_photo": 0.5896124839782715, "fire_extinguisher": 0.5364111661911011, "inspection_tag": 0.2570688724517822, "key": 0.202549010515213, "ladder": 0.10832172632217407, "lock": 0.3665195107460022, "person": 0.4242399036884308 } } }
Android Inference Configuration#
In this example the input_data_specs are set to datastore_type: image, with each inference execution run against a single image.
Under settings, file_path is set to user_input. This means that the app will run inference interactively based on user input, typically by pushing a button in the app to pass an image file path. We do not currently support periodic inference for Android.
We reference model_specs; the model must have already been trained and saved to your Portal for it to be deployable.
We do not need to set output_data_specs; currently all Android inference results are returned to the app as a JSON response, as outlined in the imageInference function above.
name: image_inference
model_specs:
  name: model_octaipipe
  type: keras
  version: "1"
input_data_specs:
  datastore_type: image
  settings:
    file_path: user_input
run_specs:
  mode: interactive
  only_last_batch: false
Android Inference Deployment#
- Define the list of device ids to deploy to; this can be one or many.
- For Android deployments we do not need to specify an image_name, as inference functionality is already built into the OctaiPipe Android library.
- Set the pipelines to image_inference and the config_path for your inference configuration as per above.
name: deployment_config

device_ids: [your-device-id]

env: {}

pipelines:
  - image_inference:
      config_path: configs/android_inference_config.yml