This folder contains an example application using TensorFlow on Android devices.
The demos in this folder are designed to provide straightforward samples of using TensorFlow in mobile applications.
Inference is done using the TensorFlow Android Inference Interface, which may be built separately if you want a standalone library to drop into your existing application. Object tracking and efficient YUV -> RGB conversion are handled by `libtensorflow_demo.so`.
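The call sequence for the inference interface is a simple feed/run/fetch cycle. As a minimal sketch, assuming code running inside an Android `Activity` with a frozen graph shipped in the APK's assets (the model filename, tensor names, and sizes below are placeholders for illustration, not the demo's actual values):

```java
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

// Dimensions for a hypothetical image classifier; real values depend on the model.
final int inputSize = 224;
final int numClasses = 1001;

// Load a frozen GraphDef from the APK assets (inside an Activity, so
// getAssets() is available).
TensorFlowInferenceInterface inferenceInterface =
    new TensorFlowInferenceInterface(getAssets(), "file:///android_asset/my_model.pb");

// Copy the preprocessed input into the graph, run it, and read the result.
float[] pixels = new float[inputSize * inputSize * 3];
float[] outputs = new float[numClasses];
inferenceInterface.feed("input", pixels, 1, inputSize, inputSize, 3);
inferenceInterface.run(new String[] {"output"});
inferenceInterface.fetch("output", outputs);
```

The same feed/run/fetch pattern appears in the demo's classifier code; only the tensor names and shapes differ per model.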
A device running Android 5.0 (API 21) or higher is required to run the demo due to the use of the camera2 API, although the native libraries themselves can run on API >= 14 devices.
- TF Classify: Uses the Google Inception model to classify camera frames in real time, displaying the top results in an overlay on the camera image.
- TF Detect: Demonstrates an SSD-MobileNet model trained with the TensorFlow Object Detection API (introduced in Speed/accuracy trade-offs for modern convolutional object detectors) to localize and track objects from 80 categories in the camera preview in real time.
- TF Stylize: Uses a model based on A Learned Representation For Artistic Style to restyle the camera preview image in the style of a number of different artists.
- TF Speech: Runs a simple speech recognition model built with the audio training tutorial. It listens for a small set of words and highlights them in the UI when they are recognized.
If you just want the fastest path to trying the demo, you may download the nightly build here. Expand the "View" and then the "out" folders under "Last Successful Artifacts" to find tensorflow_demo.apk.
Also available are precompiled native libraries and a JCenter package that you may simply drop into your own applications. See tensorflow/contrib/android/README.md for more details.
Once the app is installed, it can be started via the "TF Classify", "TF Detect", "TF Stylize", and "TF Speech" launcher entries, each marked with the orange TensorFlow logo as its icon.
While running the activities, pressing the volume keys on your device will toggle debug visualizations on/off, rendering additional info to the screen that may be useful for development purposes.
The simplest way to compile the demo app yourself, and try out changes to the project code, is to use Android Studio. Simply set this `android` directory as the project root.

Then edit the `build.gradle` file and change the value of `nativeBuildSystem` to `'none'` so that the project is built in the simplest way possible:

```
def nativeBuildSystem = 'none'
```
While this project includes full build integration for TensorFlow, this setting disables it, and uses the TensorFlow Inference Interface package from JCenter.
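If you want the same TensorFlow dependency in your own application (outside this demo), the JCenter artifact can be declared in your app's `build.gradle` roughly as follows. This is a sketch, not the demo's own configuration; version pinning is up to you, and `+` simply takes the latest available release:

```
allprojects {
    repositories {
        jcenter()
    }
}

dependencies {
    compile 'org.tensorflow:tensorflow-android:+'
}
```

See tensorflow/contrib/android/README.md for the authoritative instructions.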
Note: Currently, in this build mode, YUV -> RGB is done using a less efficient Java implementation, and object tracking is not available in the "TF Detect" activity. Setting the build system to `'cmake'` currently only builds `libtensorflow_demo.so`, which provides fast YUV -> RGB conversion and object tracking, while still acquiring TensorFlow support via the downloaded AAR, so it may be a lightweight way to enable these features.
For any project that does not include custom low level TensorFlow code, this is likely sufficient.
For details on how to include this JCenter package in your own project, see tensorflow/contrib/android/README.md.
Pick your preferred approach below. At the moment, we have full support for Bazel, and partial support for gradle, cmake, make, and Android Studio.
As a first step for all build types, clone the TensorFlow repo with:
git clone --recurse-submodules https://github.com/tensorflow/tensorflow.git
Note that `--recurse-submodules` is necessary to prevent some issues with protobuf compilation.
NOTE: Bazel does not currently support building for Android on Windows. Full support for gradle/cmake builds is coming soon, but in the meantime we suggest that Windows users download the prebuilt binaries instead.
Bazel is the primary build system for TensorFlow. To build with Bazel, you will need Bazel itself, the Android NDK, and the Android SDK installed on your system.
1. Install the latest version of Bazel as per the instructions on the Bazel website.
2. The Android NDK is required to build the native (C/C++) TensorFlow code. The current recommended version is 14b, which may be found here.
   - NDK 16, the revision released in November 2017, is incompatible with Bazel. See here.
3. The Android SDK and build tools may be obtained here, or alternatively as part of Android Studio. Build tools API >= 23 is required to build the TF Android demo (though it will run on API >= 21 devices).
   - The Android Studio SDK Manager's NDK installer will install the latest revision of the NDK, which is incompatible with Bazel. You'll need to download an older version manually, as (2) suggests.
NOTE: As long as you have the SDK and NDK installed, the `./configure` script will create these rules for you. Answer "Yes" when the script asks to automatically configure the `./WORKSPACE`.
The Android entries in `<workspace_root>/WORKSPACE` must be uncommented with the paths filled in appropriately depending on where you installed the NDK and SDK. Otherwise an error such as: "The external label '//external:android/sdk' is not bound to anything" will be reported.
Also edit the API levels for the SDK in WORKSPACE to the highest level you have installed in your SDK. This must be >= 23 (this is completely independent of the API level of the demo, which is defined in AndroidManifest.xml). The NDK API level may remain at 14.
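Filled in, the uncommented Android entries in `WORKSPACE` look roughly like the following. The paths and exact version strings below are placeholders; substitute the locations and versions from your own machine:

```
android_sdk_repository(
    name = "androidsdk",
    api_level = 23,
    build_tools_version = "26.0.1",
    # Replace with the absolute path to your SDK installation.
    path = "/home/me/Android/Sdk",
)

android_ndk_repository(
    name = "androidndk",
    api_level = 14,
    # Replace with the absolute path to your NDK installation.
    path = "/home/me/android-ndk-r14b",
)
```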
The TensorFlow `GraphDef`s that contain the model definitions and weights are not packaged in the repo because of their size. They are downloaded automatically and packaged with the APK by Bazel via a new_http_archive defined in `WORKSPACE` during the build process, and by Gradle via `download-models.gradle`.
Optional: If you wish to place the models in your assets manually, remove all of the `model_files` entries from the `assets` list in `tensorflow_demo` found in the `BUILD` file. Then download and extract the archives yourself to the `assets` directory in the source tree:
```bash
BASE_URL=https://storage.googleapis.com/download.tensorflow.org/models
for MODEL_ZIP in inception5h.zip ssd_mobilenet_v1_android_export.zip stylize_v1.zip
do
  curl -L ${BASE_URL}/${MODEL_ZIP} -o /tmp/${MODEL_ZIP}
  unzip /tmp/${MODEL_ZIP} -d tensorflow/examples/android/assets/
done
```
This will extract the models and their associated metadata files to the local assets/ directory.
If you are using Gradle, make sure to remove the `download-models.gradle` reference from `build.gradle` after you manually download the models; otherwise Gradle might download the models again and overwrite yours.
After editing your WORKSPACE file to update the SDK/NDK configuration, you may build the APK. Run this from your workspace root:
bazel build -c opt //tensorflow/examples/android:tensorflow_demo
Make sure that adb debugging is enabled on your Android 5.0 (API 21) or later device, then after building use the following command from your workspace root to install the APK:
adb install -r bazel-bin/tensorflow/examples/android/tensorflow_demo.apk
Android Studio may be used to build the demo in conjunction with Bazel. First, make sure that you can build with Bazel following the above directions. Then, look at build.gradle and make sure that the path to Bazel matches that of your system.
At this point you can add the tensorflow/examples/android directory as a new Android Studio project. Click through installing all the Gradle extensions it requests, and you should be able to have Android Studio build the demo like any other application (it will call out to Bazel to build the native code with the NDK).
Full CMake support for the demo is coming soon, but for now it is possible to build the TensorFlow Android Inference library using tensorflow/contrib/android/cmake.