
My intended task is:

  1. Create checkpoint files for my custom class (a single class) using the Inception V3 architecture: Inception in TensorFlow
  2. Freeze the checkpoint into a protobuf (.pb) using freeze_graph (see the sketch after this list)
  3. Optimize the frozen graph using optimize_for_inference
  4. Use the .pb file in the Android TF camera demo for classification: TensorFlow Android Camera Demo
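For reference, step 2 in my setup looks roughly like the sketch below (a minimal sketch assuming a TF 1.x checkpoint; the graph and checkpoint paths are placeholders, not my exact files):

from tensorflow.python.tools import freeze_graph

# Rough sketch of step 2 (TF 1.x). Paths and the checkpoint prefix are
# placeholders standing in for my actual training outputs.
freeze_graph.freeze_graph(
    input_graph='graph.pbtxt',                      # GraphDef dumped during training
    input_saver='',
    input_binary=False,                             # True if the GraphDef is a binary .pb
    input_checkpoint='model.ckpt-100000',           # checkpoint prefix from step 1
    output_node_names='tower_0/logits/predictions',
    restore_op_name='save/restore_all',
    filename_tensor_name='save/Const:0',
    output_graph='frozen_graph.pb',
    clear_devices=True,
    initializer_nodes='')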

In step 1, the batch size is set to 1 during training. I also added images = tf.identity(images, name='Inputs_layer') to name the input tensor, as suggested in the question "No Operation named [input] in the Graph" error while fine tuning/retraining inceptionV1 slim model.
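
A minimal sketch of that naming step (the placeholder below just stands in for my real input pipeline), which also prints the exact op name that has to be reused later:

import tensorflow as tf

# Stand-in for the preprocessed image batch from the input pipeline (batch size 1).
images = tf.placeholder(tf.float32, shape=[1, 299, 299, 3])

# Give the tensor an explicit, findable name.
images = tf.identity(images, name='Inputs_layer')

# This exact, case-sensitive op name is what --input_names (step 3) and
# INPUT_NAME (Android demo) must match.
print(images.op.name)  # Inputs_layer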

Before step 3,

>> bazel-bin/tensorflow/tools/graph_transforms/summarize_graph --in_graph=frozen_graph.pb
   No inputs spotted.
   No variables spotted.
   Found 1 possible outputs: (name=tower_0/logits/predictions, op=Softmax)
   Found 21781804 (21.78M) const parameters, 0 (0) variable parameters, and 188 control_edges
   Op types used: 777 Const, 378 Mul, 284 Add, 283 Sub, 190 Identity, 188 Sum, 96 Reshape, 94 Conv2D, 94 StopGradient, 94 SquaredDifference, 94 Square, 94 Mean, 94 Rsqrt, 94 Relu, 94 Reciprocal, 15 ConcatV2, 10 AvgPool, 4 MaxPool, 1 RealDiv, 1 RandomUniform, 1 QueueDequeueManyV2, 1 Softmax, 1 Split, 1 MatMul, 1 Floor, 1 FIFOQueueV2, 1 BiasAdd
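
Since summarize_graph reports "No inputs spotted", one way to see what the input op is actually called is to load the frozen GraphDef and list candidate nodes (a quick diagnostic sketch, not part of the build steps above):

import tensorflow as tf

# Load the frozen GraphDef and print likely input nodes, to find the exact
# name to pass to optimize_for_inference via --input_names.
graph_def = tf.GraphDef()
with tf.gfile.GFile('frozen_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    if node.op in ('Placeholder', 'Identity', 'QueueDequeueManyV2'):
        print(node.op, node.name)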

In step 3,

bazel-bin/tensorflow/python/tools/optimize_for_inference \
   --input=tensorflow/examples/android/assets/frozen_graph.pb \
   --output=tensorflow/examples/android/assets/stripped_graph.pb \
   --input_names=inputs_layer \
   --output_names=tower_0/logits/predictions

After step 3,

>>> bazel-bin/tensorflow/tools/graph_transforms/summarize_graph --in_graph=stripped_graph.pb
   No inputs spotted.
   No variables spotted.
   Found 1 possible outputs: (name=tower_0/logits/predictions, op=Softmax)
   Found 21781804 (21.78M) const parameters, 0 (0) variable parameters, and 188 control_edges
   Op types used: 777 Const, 378 Mul, 284 Add, 283 Sub, 188 Sum, 96 Reshape, 94 Conv2D, 94 StopGradient, 94 SquaredDifference, 94 Square, 94 Mean, 94 Rsqrt, 94 Relu, 94 Reciprocal, 15 ConcatV2, 10 AvgPool, 4 MaxPool, 1 RealDiv, 1 RandomUniform, 1 QueueDequeueManyV2, 1 Softmax, 1 Split, 1 MatMul, 1 Floor, 1 FIFOQueueV2, 1 BiasAdd
   To use with tensorflow/tools/benchmark:benchmark_model try these arguments:
   run tensorflow/tools/benchmark:benchmark_model -- --graph=stripped_graph.pb --show_flops --logtostderr --input_layer= --input_layer_type= --input_layer_shape= --output_layer=tower_0/logits/predictions

In ClassifierActivity.java,

private static final int INPUT_SIZE = 224; //299;
private static final int IMAGE_MEAN = 117;
private static final float IMAGE_STD = 1;
private static final String INPUT_NAME = "inputs_layer";
private static final String OUTPUT_NAME = "tower_0/logits/predictions";
private static final String MODEL_FILE = 
                       "file:///android_asset/stripped_graph.pb";
private static final String LABEL_FILE =
      "file:///android_asset/custom_label.txt";

After following the above 4 steps, the APK crashes on an Android device with this log:

E/AndroidRuntime( 8558): FATAL EXCEPTION: inference
   E/AndroidRuntime( 8558): Process: org.tensorflow.demo, PID: 8558
   E/AndroidRuntime( 8558): java.lang.IllegalArgumentException: No Operation named [inputs_layer] in the Graph

How do I fix this?

  • Looks like the build succeeded so the problem is likely unrelated to Bazel. If you agree, could you please remove the "bazel" tag? – László May 04 '17 at 11:39
  • Bazel tag is removed. Thanks. – Santle Camilus May 04 '17 at 16:51
  • @Dr.SantleCamilus any progress on this issue? I have got a similar question on why the pretrained->frozen->quantized inception-v3 has input=input and output=InceptionV3/Reshape_1 instead of 'Mul' and 'Softmax'. Here is my question: https://stackoverflow.com/questions/50941369/inconsistency-between-names-of-the-input-and-output-in-tensorflows-models – Amir Jun 20 '18 at 15:57

1 Answer


When you run optimize_for_inference, the input node name you pass is not right. You gave inputs_layer, so the optimized .pb file does not contain an operation with the name the Android demo is looking for.

Nothing in your summarize_graph output says that your input node is called inputs_layer (it reports "No inputs spotted", and in step 1 you named the tensor Inputs_layer, with a capital I; node names are case-sensitive). Pass the correct input node name and it should work.
