OE 35. TFLite support

Add import for the TFLite model format

  • Author: Julia Bareeva
  • Link: #13918
  • Status: Done (#23161)
  • Platforms: All
  • Complexity: 1-2 man-months

Introduction and Rationale

TensorFlow Lite is a framework for on-device inference. Model files in other formats can be very large, but converting them to TFLite makes them mobile-friendly and usable on small devices. TFLite also supports quantized networks, so it could be a good platform for quantization support experiments in OpenCV.

Proposed solution

We can support import from *.tflite files in the same way we do for *.pb files (the TensorFlow format). To do this, we need to parse files in the FlatBuffers format using parsing code generated from the TFLite schema, as in the sketch below.
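
For illustration only, here is a minimal sketch of that parsing step, assuming headers generated by flatc from the TFLite schema (tensorflow/lite/schema/schema.fbs) are available; model.tflite is a placeholder file name, and this is not the actual importer code:

#include <cstdio>
#include <fstream>
#include <iterator>
#include <vector>
#include "schema_generated.h"  // generated by flatc from tensorflow/lite/schema/schema.fbs

int main() {
    // Read the raw .tflite file into memory.
    std::ifstream file("model.tflite", std::ios::binary);
    std::vector<char> buffer((std::istreambuf_iterator<char>(file)),
                             std::istreambuf_iterator<char>());

    // FlatBuffers data is accessed in place; no separate unpacking step is needed.
    const tflite::Model* model = tflite::GetModel(buffer.data());

    // Each subgraph lists its tensors and operators, which an importer would
    // walk to build the corresponding OpenCV DNN layers.
    const tflite::SubGraph* graph = model->subgraphs()->Get(0);
    std::printf("tensors: %u, operators: %u\n",
                graph->tensors()->size(), graph->operators()->size());
    return 0;
}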

Impact on existing code, compatibility

In general, the existing interface shouldn't change much.
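
As an illustration, the sketch below loads a TFLite model through the generic cv::dnn::readNet entry point, assuming it dispatches on the .tflite extension the way it already does for .pb; the model and image file names and the input size are placeholders:

#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>

int main() {
    // readNet picks the importer from the file extension, so a *.tflite
    // model would load the same way a *.pb model does today.
    cv::dnn::Net net = cv::dnn::readNet("model.tflite");  // placeholder file name

    // The rest of the inference pipeline is unchanged.
    cv::Mat img = cv::imread("input.jpg");  // placeholder file name
    cv::Mat blob = cv::dnn::blobFromImage(img, 1.0 / 255.0, cv::Size(224, 224));
    net.setInput(blob);
    cv::Mat out = net.forward();
    return 0;
}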

Possible alternatives

TFLite models can be converted to frozen TensorFlow graphs:

bazel run --config=opt //tensorflow/lite/toco:toco -- --input_file=model.tflite --output_file=graph.pb --input_format=TFLITE --output_format=TENSORFLOW_GRAPHDEF
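
If the conversion succeeds, the resulting frozen graph can then be loaded with the existing TensorFlow importer, for example:

#include <opencv2/dnn.hpp>

int main() {
    // graph.pb is the output of the toco conversion above.
    cv::dnn::Net net = cv::dnn::readNetFromTensorflow("graph.pb");
    return 0;
}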

But this conversion doesn't work for all models. For example, there are several known problems with MediaPipe models: https://github.com/google/mediapipe/issues/2770

References

Related feature requests from the OpenCV forum:

  • Does readNetFromTensorflow support ".tflite" format?
  • Include .tflite or .pb files
  • Tensorflow lite Graph with OpenCV DNN
