tflite2onnx

Convert TensorFlow Lite models to ONNX


Keywords
tflite, onnx, deep-learning, model-converter, pip, tensorflow
License
Apache-2.0
Install
pip install tflite2onnx==0.4.1

Documentation

tflite2onnx - Convert TensorFlow Lite models to ONNX

tflite2onnx converts TensorFlow Lite (TFLite) models (*.tflite) to ONNX models (*.onnx), with data layout and quantization semantics properly handled (see the introduction blog for details).

Highlights

  • If you'd like to convert a TensorFlow model (frozen graph *.pb, SavedModel, or whatever) to ONNX, try tf2onnx. Alternatively, you can first convert it to a TFLite (*.tflite) model and then convert the TFLite model to ONNX (see the sketch after this list).

  • Microsoft implemented another TensorFlow Lite to ONNX model converter in tf2onnx in February 2021 (we open sourced tflite2onnx in May 2020). tf2onnx seems to handle quantization just as we do, and it also appears to convert RNN networks, which we do not support yet. Please try tf2onnx --tflite if tflite2onnx is missing any functionality you need.
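
For reference, here is a minimal sketch of the two-step path mentioned above, using the standard TensorFlow Lite converter API followed by tflite2onnx; the SavedModel path is a placeholder you would replace with your own model.

import tensorflow as tf
import tflite2onnx

# Step 1: convert a TensorFlow SavedModel to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model('/path/to/saved_model')
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)

# Step 2: convert the TFLite model to ONNX.
tflite2onnx.convert('model.tflite', 'model.onnx')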

Installation

Install via pip: pip install tflite2onnx.

Or install from source to get the latest features (preferably inside a virtualenv):

  1. Download the repo: git clone https://github.com/zhenhuaw-me/tflite2onnx.git && cd tflite2onnx
  2. Build the package: ./scripts/build-wheel.sh
  3. Install the built package: pip install assets/dist/tflite2onnx-*.whl

Or you can simply add the code tree to your $PYTHONPATH. (The command line tool is not available in this mode.)

export PYTHONPATH=$(pwd):${PYTHONPATH}

Usage

Python Interface

import tflite2onnx

tflite_path = '/path/to/original/tflite/model'
onnx_path = '/path/to/save/converted/onnx/model'

tflite2onnx.convert(tflite_path, onnx_path)
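
This is not part of tflite2onnx itself, but as a quick sanity check you can load the converted model with the onnx package and run its structural checker (a minimal sketch reusing the paths above):

import onnx

# Load the converted model and verify that it is a well-formed ONNX graph.
onnx_model = onnx.load(onnx_path)
onnx.checker.check_model(onnx_model)
print([i.name for i in onnx_model.graph.input])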

tflite2onnx now supports explicit layout; check the test example.
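
The sketch below is only an assumption of how an explicit layout mapping might be passed, using a hypothetical tensor name; the linked test example is the authoritative reference for the actual argument name and format.

# Assumed interface: a mapping from tensor name to (source, target) layouts.
# The argument, the tensor name, and the layout strings are illustrative only;
# see the test example for the real usage.
layouts = {'input': ['NHWC', 'NCHW']}
tflite2onnx.convert(tflite_path, onnx_path, layouts)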

Command Line

tflite2onnx /path/to/original/tflite/model /path/to/save/converted/onnx/model

Documentation

Contributing

Check the contribution guide for more.

License

Apache License Version 2.0.