Run your PyTorch model on Android GPU using libMACE

How to speed up a PyTorch model on Android with GPU inference and the MACE library

Installation of libMACE

Next, we will look at the steps for installing libMACE.
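The steps below are a sketch of building libMACE from source, assuming the upstream Xiaomi repository and a working Bazel plus Android NDK toolchain; the exact Bazel target names and flags can differ between MACE versions, so check the version you clone.

```shell
# Clone the MACE source (assumption: building from the upstream repo).
git clone https://github.com/XiaoMi/mace.git
cd mace

# Build libMACE for 64-bit ARM with the OpenCL (GPU) runtime enabled.
# Target name and defines are per MACE's build docs and may vary by version.
bazel build --config android --config optimization \
    //mace/libmace:libmace \
    --define neon=true --define opencl=true \
    --cpu=arm64-v8a
```

The resulting shared library and the MACE public headers are what the Android Studio project links against later.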

Converting a model

MACE uses its own format for representing neural networks, so the original model has to be converted. The conversion process consists of several stages; we will walk through them using ResNet-50 from the torchvision library.

Configuring the Android Studio project

When creating the app in Android Studio, we need to choose the C++ Native Application project type.

Creating new Android Studio project
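The C++ Native Application template generates a CMake build for the native part of the app; the prebuilt libMACE can be wired in there. The fragment below is a sketch only: the library and header paths, the `native-lib` target name, and the directory layout are assumptions to adapt to your project.

```cmake
# Sketch of CMakeLists.txt additions for linking the prebuilt libMACE.
# Paths and target names are assumptions -- adjust to your layout.
cmake_minimum_required(VERSION 3.10)
project(mace_demo)

# Import the prebuilt MACE shared library for the current ABI.
add_library(mace SHARED IMPORTED)
set_target_properties(mace PROPERTIES IMPORTED_LOCATION
    ${CMAKE_SOURCE_DIR}/../libs/${ANDROID_ABI}/libmace.so)

# MACE public headers.
include_directories(${CMAKE_SOURCE_DIR}/../include)

# The app's JNI library, linked against MACE and the Android log library.
add_library(native-lib SHARED native-lib.cpp)
target_link_libraries(native-lib mace log)
```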

Model loading

First, we will add the model loading function to MainActivity.java.
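Before anything crosses the JNI boundary, the compiled model files shipped in the APK's assets have to be read into byte arrays. Android's `AssetManager.open()` returns a plain `InputStream`, so the loading logic itself can be ordinary Java; the sketch below shows such a helper (the class name, and the idea of passing raw bytes to the native side, are assumptions about app structure, not MACE requirements).

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ModelLoader {
    // Read an entire stream -- e.g. AssetManager.open("resnet50.pb") --
    // into a byte array that can be handed across JNI to the native side.
    public static byte[] readAllBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

In `MainActivity.java`, this helper would be called once per model file (graph and weights), with the resulting arrays passed to a native initialization method.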

Model inference

For model inference, we declare a classification function in MainActivity.
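The native side runs the MACE engine and returns one score per class; the Java-side classification function then picks the highest-scoring index. The sketch below abstracts the native call behind a hypothetical `Backend` interface, since in the real app it would be a JNI method implemented in C++ on top of the MACE engine.

```java
public class Classifier {
    // Hypothetical stand-in for the JNI inference call.
    public interface Backend {
        float[] infer(float[] input);
    }

    private final Backend backend;

    public Classifier(Backend backend) {
        this.backend = backend;
    }

    // Run inference and return the index of the highest-scoring class.
    public int classify(float[] input) {
        float[] scores = backend.infer(input);
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) {
                best = i;
            }
        }
        return best;
    }
}
```

For ResNet-50 the returned index maps to one of the 1000 ImageNet class labels, so the app would keep a label list alongside the model to display a human-readable result.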
