Running sample TensorFlow iOS apps
In the last two sections of this chapter, we'll run three sample iOS apps and four sample Android apps that come with TensorFlow 1.4, both to make sure your mobile TensorFlow development environments are set up correctly and to give you a quick preview of what TensorFlow mobile apps can do.
The source code of the three sample TensorFlow iOS apps is located at tensorflow/examples/ios: simple, camera, and benchmark. To run these samples successfully, you first need to download a pretrained deep learning model released by Google, called Inception (https://github.com/tensorflow/models/tree/master/research/inception), for image recognition. There are several versions of Inception, v1 to v4, with each newer version offering better accuracy. Here we'll use Inception v1, as the samples were developed for it. After downloading and unzipping the model file, copy the model-related files to each sample's data folder:
mkdir -p ~/graphs
curl -o ~/graphs/inception5h.zip https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip
unzip ~/graphs/inception5h.zip -d ~/graphs/inception5h
cd tensorflow/examples/ios
cp ~/graphs/inception5h/* simple/data/
cp ~/graphs/inception5h/* camera/data/
cp ~/graphs/inception5h/* benchmark/data/
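If an app later fails to load the model, a missing or misplaced file in a data folder is the usual cause. The following is a minimal sketch of a sanity check you could run from tensorflow/examples/ios; the helper name `check_model_files` is hypothetical, but the two file names are the ones shipped in the inception5h archive that the samples load at startup:

```shell
# check_model_files: hypothetical helper that reports whether an app's data
# folder contains the two files the TensorFlow iOS samples expect.
check_model_files() {
  app_dir="$1"
  for f in tensorflow_inception_graph.pb imagenet_comp_graph_label_strings.txt; do
    if [ -f "$app_dir/data/$f" ]; then
      echo "$app_dir/data/$f: present"
    else
      echo "$app_dir/data/$f: MISSING"
    fi
  done
}

# Run from tensorflow/examples/ios after the cp commands above:
for app in simple camera benchmark; do
  check_model_files "$app"
done
```

If any line reports MISSING, repeat the corresponding cp command before building the app.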
Now, go to each app folder and run the following commands to download the required pod for each app before opening and running the apps:
cd simple
pod install
open tf_simple_example.xcworkspace
cd ../camera
pod install
open tf_camera_example.xcworkspace
cd ../benchmark
pod install
open tf_benchmark_example.xcworkspace
You can then run the three apps on an iOS device, or the simple and benchmark apps on an iOS simulator. After launching the simple app, tap the Run Model button and you'll see a text message saying that the TensorFlow Inception model has been loaded, followed by the top recognition results and their confidence values.
If you tap the Benchmark Model button after launching the benchmark app, you'll see the time it takes to run the model, averaged over 20 runs. For example, it averages about 0.2089 seconds on my iPhone 6, and 0.0359 seconds on the iPhone 6 simulator.
Finally, run the camera app on an iOS device and point the device's camera around: the app shows you the objects it sees and recognizes in real time.