Deployment of converted .pb file to .tflite using relay.frontend.from_tflite


#1

I tried to convert my TensorFlow .pb model into .tflite format using TensorFlow Lite, but TVM could not compile the result. To debug this, I created a single-layer convolutional network, generated a .pb file from it, and converted that to a .tflite file with TF Lite. TVM could not compile even this single-layer network. I looked into https://github.com/dmlc/tvm/blob/master/python/tvm/relay/frontend/tflite.py, found the operator conversion function, and tried to print the converted node name with the code below.

import tflite
from tflite.Model import Model

tflite_model_buf = open("single.tflite", "rb").read()
tflite_model = Model.GetRootAsModel(tflite_model_buf, 0)

def build_str_map(obj):
    """Build string map of TFLite enum int value

    Parameters
    ----------
    obj:
        TFLite class which contains enum int value, such as BuiltinOptions

    Returns
    -------
    String representation map of TFLite class enum int value
    """
    ret = {}
    for field_name in dir(obj):
        if not field_name.startswith('_'):
            field_value = getattr(obj, field_name)
            if isinstance(field_value, int):
                ret[field_value] = field_name
    return ret

try:
    from tflite.BuiltinOperator import BuiltinOperator
except ImportError:
    raise ImportError("The tflite package must be installed")

subgraph = tflite_model.Subgraphs(0)

# The network has a single operator, so inspect operator 0 only.
op = subgraph.Operators(0)
op_code_list_idx = op.OpcodeIndex()
op_code_id = tflite_model.OperatorCodes(op_code_list_idx).BuiltinCode()
builtin_op_code = build_str_map(BuiltinOperator())
op_code_str = builtin_op_code[op_code_id]
print(op_code_str)

print(op_code_str) prints "DEPTHWISE_CONV_2D", but I created only a Conv2D TensorFlow layer. This suggests a schema mismatch, yet I am using TensorFlow 1.13.1 and installed tflite-1.13.1 from https://docs.tvm.ai/tutorials/frontend/from_tflite.html#sphx-glr-tutorials-frontend-from-tflite-py for TVM. Can anyone please comment on how to get this .pb to .tflite to TVM conversion working?
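The reported name depends entirely on which integer values the installed tflite package assigns to the builtin-operator enum. A minimal, self-contained sketch of how that can go wrong (the two schema classes and their enum values here are invented for illustration, not the real TFLite enums): if the parser's enum values are shifted relative to the schema the model was serialized with, the reverse map built by build_str_map resolves a stored CONV_2D code to the wrong name.

```python
def build_str_map(obj):
    """Map each enum int value of a class to its field name."""
    ret = {}
    for field_name in dir(obj):
        if not field_name.startswith('_'):
            field_value = getattr(obj, field_name)
            if isinstance(field_value, int):
                ret[field_value] = field_name
    return ret

class WriterSchema:
    """Enum values the model was serialized with (invented for illustration)."""
    CONV_2D = 3
    DEPTHWISE_CONV_2D = 4

class ReaderSchema:
    """Hypothetical mismatched parser enums: same names, shifted values."""
    CONV_2D = 4
    DEPTHWISE_CONV_2D = 3

op_code_id = WriterSchema.CONV_2D              # the integer 3 stored in the model
name = build_str_map(ReaderSchema())[op_code_id]
print(name)  # DEPTHWISE_CONV_2D: a plain conv misreported by the reader
```

If the installed tflite package really matched the producing schema, this kind of misreport should not happen, which is why checking the package version is the first step.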


#2

Please share the tflite file.


#3

@FrozenGene Please find the tflite file in the link https://drive.google.com/file/d/1CRSniHJudUIw0nLl-gjvK9XMO0FvSUTO/view?usp=sharing.

This single-layer convolutional network has a depth channel of 1. When I changed the depth channel to 2, the compilation on TVM worked. I guess that TVM always treats convolutions with depth = 1 as "DEPTHWISE_CONV_2D".
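For what it's worth, with a single input channel an ordinary convolution and a depthwise convolution (channel multiplier 1) compute exactly the same result, so the two ops are mathematically interchangeable in that case, which may be why a converter or frontend could represent a depth-1 conv either way. A plain-Python sketch (no TF/TVM needed; the helper names are my own):

```python
def conv2d_single_channel(inp, kernel):
    """Valid-padding 2D convolution (cross-correlation) on one channel."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(inp) - kh + 1
    ow = len(inp[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            for di in range(kh):
                for dj in range(kw):
                    out[i][j] += inp[i + di][j + dj] * kernel[di][dj]
    return out

def depthwise_conv2d(channels, kernels):
    """Depthwise conv: each input channel convolved with its own kernel."""
    return [conv2d_single_channel(c, k) for c, k in zip(channels, kernels)]

inp = [[1.0, 2.0, 3.0],
       [4.0, 5.0, 6.0],
       [7.0, 8.0, 9.0]]
kernel = [[1.0, 0.0],
          [0.0, -1.0]]

normal = conv2d_single_channel(inp, kernel)
depthwise = depthwise_conv2d([inp], [kernel])[0]
print(normal == depthwise)  # True: identical outputs for one channel
```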


#4

I tested this model and cannot reproduce your issue. Your model is not "DEPTHWISE_CONV_2D"; it is a normal convolution.

I suspect your tflite package is not correct. Consider installing the prebuilt package from https://docs.tvm.ai/tutorials/frontend/from_tflite.html#sphx-glr-tutorials-frontend-from-tflite-py


#5

@FrozenGene By mistake I sent the link to a tflite file that works on TVM, since its convolutional layer has input depth channel = 2. Try https://drive.google.com/open?id=1H_NPWpsQKRG-YQ2AZgjKXuULrJVkEAiI instead. This single-layer convolutional network has input channel depth = 1, and for this TVM doesn't compile.