Deploying a TensorFlow Model on an Android Device for Human Activity Recognition


In this tutorial, we will learn how to deploy a human activity recognition (HAR) model on an Android device for real-time prediction. The majority of the code in this post is adapted from Omid Alemi's elegant tutorial "Build Your First Tensorflow Android App"; this post would not have been possible without his contribution, and I am building upon his work here. If you have not seen the post on how to build a deep convolutional neural network for HAR, please follow this link. Below, we discuss how to freeze and export the model for use in an Android app.

We need a frozen TensorFlow graph with learned weights that can be imported into an Android app for making predictions from accelerometer data. Before proceeding further, make sure the model's input and output nodes are given valid names. Another thing to notice is that the input shape of the CNN is 1 x 90 x 3 (i.e. height, width, and channels). However, from the Android app we will feed a vector of size 270 (90 x 3), which can be reshaped inside the model. To do this, an input node of shape (None, 270) is added on top of the model built earlier, as follows:
X = tf.placeholder(tf.float32, shape=[None, input_width * num_channels], name="input")
X_reshaped = tf.reshape(X, [-1, 1, 90, 3])
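The output node needs a valid name as well, since the freezing and optimization tools used below look it up by name. A minimal sketch, assuming logits holds the output of the final fully connected layer of the CNN from the earlier post (the exact variable name there may differ):

# Hypothetical variable name: "logits" stands in for the final layer of the CNN.
# Naming the softmax op "y_" lets the freezing/optimization tools (and the
# Android app) refer to the output tensor as "y_:0".
y_ = tf.nn.softmax(logits, name="y_")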

Now add the following lines of code after the model training steps to checkpoint the model and save the graph definition.

with tf.Session() as session:
    # ------- Model training code goes here -------
    # Create a saver if one was not already created during training.
    saver = tf.train.Saver()
    tf.train.write_graph(session.graph_def, '.', '../har.pbtxt')
    saver.save(session, save_path="../har.ckpt")

Next, the checkpointed model is frozen, and an optimized version of the frozen graph is saved as follows:

import tensorflow as tf
from tensorflow.python.tools import freeze_graph
from tensorflow.python.tools import optimize_for_inference_lib

freeze_graph.freeze_graph(input_graph="../har.pbtxt", input_saver="",
                          input_binary=False, input_checkpoint="../har.ckpt",
                          output_node_names="y_",
                          restore_op_name="save/restore_all",
                          filename_tensor_name="save/Const:0",
                          output_graph="frozen_har.pb",
                          clear_devices=True, initializer_nodes="")

# Load the frozen graph (a binary protobuf) back in.
input_graph_def = tf.GraphDef()
with tf.gfile.Open("frozen_har.pb", "rb") as f:
    data = f.read()
    input_graph_def.ParseFromString(data)

# Strip training-only nodes, keeping only what is needed for inference.
output_graph_def = optimize_for_inference_lib.optimize_for_inference(
    input_graph_def,
    ["input"],   # input node names
    ["y_"],      # output node names
    tf.float32.as_datatype_enum)

with tf.gfile.FastGFile("optimized_har.pb", "wb") as f:
    f.write(output_graph_def.SerializeToString())
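Before shipping optimized_har.pb to the app, it is worth a quick sanity check that the optimized graph still exposes the node names the app will use. A minimal sketch (the names "input" and "y_" follow the code above):

import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.FastGFile("optimized_har.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
    # Both lookups should succeed and print the expected dtypes/shapes;
    # a KeyError here means a node name does not match what the app expects.
    print(graph.get_tensor_by_name("input:0"))
    print(graph.get_tensor_by_name("y_:0"))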

Let's make a simple Android app that collects accelerometer data and makes predictions about user activities with the learned model. Before doing so, get the TensorFlow libraries for Android from the following link. For the purpose of this tutorial, I used the libraries from build artifact number 117.

We will now create an empty app and copy libandroid_tensorflow_inference_java.jar and the arm64-v8a, armeabi-v7a, x86 and x86_64 files/folders into the libs directory of the project. Also, copy the frozen, optimized model into the assets directory of the project. The figure below shows the directory structure after copying all the required files.

[Figure: Directory structure]

Afterwards, add the following entry to the build.gradle file so the build system knows where the native libraries are located.

sourceSets {
    main {
        jniLibs.srcDirs = ['libs']
    }
}

Now we create a class that uses TensorFlowInferenceInterface to feed data to the model, run inference, and fetch the results.

import android.content.Context;
import android.content.res.AssetManager;

import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class ActivityInference {

    static {
        System.loadLibrary("tensorflow_inference");
    }

    private static ActivityInference activityInferenceInstance;
    private TensorFlowInferenceInterface inferenceInterface;
    private static final String MODEL_FILE = "file:///android_asset/optimized_har.pb";
    private static final String INPUT_NODE = "input";
    private static final String[] OUTPUT_NODES = {"y_"};
    private static final String OUTPUT_NODE = "y_";
    private static final long[] INPUT_SIZE = {1, 270};
    private static final int OUTPUT_SIZE = 6;
    private static AssetManager assetManager;

    public static ActivityInference getInstance(final Context context) {
        if (activityInferenceInstance == null) {
            activityInferenceInstance = new ActivityInference(context);
        }
        return activityInferenceInstance;
    }

    public ActivityInference(final Context context) {
        this.assetManager = context.getAssets();
        inferenceInterface = new TensorFlowInferenceInterface(assetManager, MODEL_FILE);
    }

    public float[] getActivityProb(float[] input_signal) {
        float[] result = new float[OUTPUT_SIZE];
        // Feed the flattened accelerometer window, run the graph, fetch the output.
        inferenceInterface.feed(INPUT_NODE, input_signal, INPUT_SIZE);
        inferenceInterface.run(OUTPUT_NODES);
        inferenceInterface.fetch(OUTPUT_NODE, result);
        // Class order: Downstairs, Jogging, Sitting, Standing, Upstairs, Walking
        return result;
    }
}

The rest of the code, provided below, is from the MainActivity class. It listens to the accelerometer sensor and uses the getActivityProb method of the ActivityInference class to get a probability for each class and update the UI.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.widget.TextView;

import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;

public class MainActivity extends AppCompatActivity implements SensorEventListener {

    private final int N_SAMPLES = 90;
    private static List<Float> x;
    private static List<Float> y;
    private static List<Float> z;
    private static List<Float> input_signal;
    private SensorManager mSensorManager;
    private Sensor mAccelerometer;
    private ActivityInference activityInference;

    private TextView downstairsTextView;
    private TextView joggingTextView;
    private TextView sittingTextView;
    private TextView standingTextView;
    private TextView upstairsTextView;
    private TextView walkingTextView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        x = new ArrayList<>();
        y = new ArrayList<>();
        z = new ArrayList<>();
        input_signal = new ArrayList<>();

        downstairsTextView = (TextView) findViewById(R.id.downstairs_prob);
        joggingTextView = (TextView) findViewById(R.id.jogging_prob);
        sittingTextView = (TextView) findViewById(R.id.sitting_prob);
        standingTextView = (TextView) findViewById(R.id.standing_prob);
        upstairsTextView = (TextView) findViewById(R.id.upstairs_prob);
        walkingTextView = (TextView) findViewById(R.id.walking_prob);

        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mAccelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        mSensorManager.registerListener(this, mAccelerometer, SensorManager.SENSOR_DELAY_FASTEST);
        activityInference = new ActivityInference(getApplicationContext());
    }

    @Override
    protected void onPause() {
        super.onPause();
        mSensorManager.unregisterListener(this);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mSensorManager.registerListener(this, mAccelerometer, SensorManager.SENSOR_DELAY_FASTEST);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        activityPrediction();
        x.add(event.values[0]);
        y.add(event.values[1]);
        z.add(event.values[2]);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int i) {
    }

    private void activityPrediction() {
        if (x.size() == N_SAMPLES && y.size() == N_SAMPLES && z.size() == N_SAMPLES) {
            // Copy all x, y and z values into one array of shape N_SAMPLES * 3
            input_signal.addAll(x);
            input_signal.addAll(y);
            input_signal.addAll(z);

            // Perform inference using TensorFlow
            float[] results = activityInference.getActivityProb(toFloatArray(input_signal));

            downstairsTextView.setText(Float.toString(round(results[0], 2)));
            joggingTextView.setText(Float.toString(round(results[1], 2)));
            sittingTextView.setText(Float.toString(round(results[2], 2)));
            standingTextView.setText(Float.toString(round(results[3], 2)));
            upstairsTextView.setText(Float.toString(round(results[4], 2)));
            walkingTextView.setText(Float.toString(round(results[5], 2)));

            // Clear all the values to start collecting the next window
            x.clear();
            y.clear();
            z.clear();
            input_signal.clear();
        }
    }

    private float[] toFloatArray(List<Float> list) {
        int i = 0;
        float[] array = new float[list.size()];
        for (Float f : list) {
            array[i++] = (f != null ? f : Float.NaN);
        }
        return array;
    }

    public static float round(float d, int decimalPlace) {
        BigDecimal bd = new BigDecimal(Float.toString(d));
        bd = bd.setScale(decimalPlace, BigDecimal.ROUND_HALF_UP);
        return bd.floatValue();
    }
}

This is all we need to use TensorFlow on Android for making predictions about user activities. A simple app UI will look something like the screenshot below.

[Screenshot: HAR app UI]

The complete code of the app is available at the following link. If you have any questions, please comment below.

Original post: http://aqibsaeed.github.io/2017-05-02-deploying-tensorflow-model-andorid-device-human-activity-recognition/
