TensorFlow Lite Android Object Detection — Mobile SSD models are expected to have exactly 4 outputs, found 8

Question:

PROBLEM ENCOUNTERED:

E/AndroidRuntime: FATAL EXCEPTION: main
Process: org.tensorflow.lite.examples.detection, PID: 14719
java.lang.AssertionError: Error occurred when initializing ObjectDetector: Mobile SSD models are expected to have exactly 4 outputs, found 8

Problem Description

  • Android Application Source: TensorFlow Lite Object Detection Example from Google
  • Error shown when starting the Example Application

Model Description

  • Custom Model Used? YES

  • Pre-trained Model Used: ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8

  • Inference type: FLOAT

  • Number of classes: 4

System Information

  • OS Platform and Distribution: Linux Ubuntu 20.14
  • TensorFlow Version: 2.4.1
  • TensorFlow installed from: Pip

Saved Model conversion commands used:

1. Saved_Model.pb export:

python ./exporter_main_v2.py \
    --input_type image_tensor \
    --pipeline_config_path ./models/ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8/pipeline.config \
    --trained_checkpoint_dir ./models/ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8 \
    --output_directory exported_models/tflite

2. Convert saved model (.pb) to tflite

toco \
    --saved_model_dir ./exported-models/tflite/saved_model \
    --emit-select-tf-ops true \
    --allow_custom_ops \
    --graph_def_file ./exported-models/tflite/saved_model/saved_model.pb \
    --output_file ./exported-models/tflite/tflite/detect.tflite \
    --input_shapes 1,300,300,3 \
    --input_arrays normalized_input_image_tensor \
    --output_arrays 'TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
    --inference_type=FLOAT \
    --allow_custom_ops

Remarks
I am trying to use a custom trained model with the TensorFlow Lite example application provided by Google, but every time I open the application it crashes with the error above: Mobile SSD models are expected to have exactly 4 outputs, found 8. The model is trained to identify 4 classes, all listed in labelmap.txt and the pipeline config.

Does anybody have any clue about this error?

Asked By: Alex1338


Answers:

After further study, I believe the error is raised because the model has 8 output tensors, while the Android application written in Java only supports 4 output tensors (at least the example provided by Google only supports 4).

I am not certain about the number of output tensors on every model. From what I have seen while experimenting with different models, models with a fixed_shape_resizer of 640 x 640 tend to require more than 4 output tensors (usually 8), which is not compatible with the Java Android application. You can count the output tensors yourself, as shown below.
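
A minimal sketch for counting a converted model's output tensors with the TFLite Interpreter in Python (the file name detect.tflite is just a placeholder for your own converted model):

import tensorflow as tf

# Load the converted model and list its output tensors.
interpreter = tf.lite.Interpreter(model_path="detect.tflite")
interpreter.allocate_tensors()

output_details = interpreter.get_output_details()
print("Number of output tensors:", len(output_details))  # the example app expects 4
for detail in output_details:
    print(detail["name"], detail["shape"])

If this prints 8 instead of 4, the model will trigger the AssertionError above.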

For any amateur users like me, here are the prerequisites for using your custom model in the Android application.
Suggested setup (assuming you are using TensorFlow version >= 2.3):

  • TensorFlow Model: SSD model with fixed_shape_resizer of 320 x 320
    (In my case, SSD MobileNet v2 320×320 works perfectly fine)
    (The model must have exactly 4 output tensors)

  • Colab (perfect for model training and conversion)
    (I tried to perform the training and conversion on both Linux and Windows on my local machine, and the incompatibility of different tools and packages gave me a headache. I ended up using Colab for both training and conversion; it is much more powerful and works well with the training tools and scripts.)

  • The metadata writer library written by @lu-wang-g
    (In my case, after converting the trained model to .tflite, if you migrate the .tflite model directly into the Android application, the application reports lots of problems with the model's configuration. Assuming you trained and converted the model correctly, all you need is the metadata writer library above: it automatically writes the metadata according to the .tflite model, and then you can migrate the model straight into the application. A sketch of a typical invocation follows this list.)
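
For reference, here is a minimal sketch of running the object detector metadata writer from the tflite-support package. The file names and the normalization values [127.5] / [127.5] are assumptions; replace them with your own model, label file, and preprocessing parameters:

from tflite_support.metadata_writers import object_detector
from tflite_support.metadata_writers import writer_utils

MODEL_PATH = "detect.tflite"            # converted model without metadata (assumed name)
LABEL_FILE = "labelmap.txt"             # one class name per line
EXPORT_PATH = "detect_metadata.tflite"  # model with metadata, ready for the app

# Normalization must match the training preprocessing;
# mean/std of 127.5 maps uint8 [0, 255] input to [-1, 1].
writer = object_detector.MetadataWriter.create_for_inference(
    writer_utils.load_file(MODEL_PATH), [127.5], [127.5], [LABEL_FILE])
writer_utils.save_file(writer.populate(), EXPORT_PATH)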

For details, please see my GitHub issue:
https://github.com/tensorflow/tensorflow/issues/47595

Answered By: Alex1338

For those who stumble on this problem/question later: the limitations on the number of output tensors are part of the TensorFlow Lite Object Detection API specification described here.
I don't know yet how to make a model compatible with these requirements, but I will append to my answer if/when I figure that out.

UPDATE

Here is the official Google Colab with an example of a model conversion. The interesting part is this script call:

python models/research/object_detection/export_tflite_graph_tf2.py \
    --trained_checkpoint_dir {'ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8/checkpoint'} \
    --output_directory {'ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8/tflite'} \
    --pipeline_config_path {'ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8/pipeline.config'}

The script doesn't convert your model; it makes the model compatible with TFLite in terms of the operations used and the input/output format.
A comment inside the script claims that only SSD meta-architectures are supported (as also stated here). In the same directory of the repo where this script lives there are other scripts that seem to do similar things, though no clear description is given.
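
The actual conversion to .tflite is then a separate step. A minimal sketch using the TF2 converter, assuming the directory layout from the script call above:

import tensorflow as tf

# Convert the TFLite-compatible SavedModel produced by export_tflite_graph_tf2.py.
converter = tf.lite.TFLiteConverter.from_saved_model(
    "ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8/tflite/saved_model")
tflite_model = converter.convert()

with open("detect.tflite", "wb") as f:
    f.write(tflite_model)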

Answered By: Petr Rybakov