Running TFLite models on TVM


We can run various TFLite models with TVM. I tried this example, and it worked:

This is the TFLite model zoo:

Some TFLite models use asymmetric quantization, so I'm not sure those models can be easily imported into Relay or TVM.

So I doubt that I can repeat the same tutorial example with every TFLite model in the model zoo. If anyone has any ideas, that would be greatly helpful.


Running pre-quantized TFLite models is an ongoing effort. The latest PR tackles quantized InceptionV1. You can try different models from the TFLite hosted-models repo; I think everything works except InceptionV3.


Oh okay…
Thanks for your reply.

Also, I tried the classification model mobilenet_v1, and it compiled on TVM successfully.

But with mobilenet_ssd for object detection, it seems I cannot correctly set the data type (dtype) for the input tensor in the from_tflite function.
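For reference, here is a minimal sketch of the shape/dtype dictionaries that `relay.frontend.from_tflite` accepts; the tensor name `input` and the NHWC shape below are assumptions for a typical pre-quantized mobilenet_v1, so check your model's actual input tensor name before using them:

```python
# Sketch (assumptions noted below) of the input dicts passed to
# tvm.relay.frontend.from_tflite for a pre-quantized classification model.

input_name = "input"            # assumed TFLite input tensor name
input_shape = (1, 224, 224, 3)  # NHWC layout, typical for mobilenet_v1

shape_dict = {input_name: input_shape}
# Pre-quantized TFLite models feed uint8 data; a float model would use
# "float32" here instead. Getting this wrong is a common source of
# dtype errors in from_tflite.
dtype_dict = {input_name: "uint8"}

# These dicts would then be passed along with the parsed model, e.g.:
#   mod, params = relay.frontend.from_tflite(
#       tflite_model, shape_dict=shape_dict, dtype_dict=dtype_dict)
print(shape_dict)
print(dtype_dict)
```

For an SSD model the input name and shape differ, but the same uint8-vs-float32 distinction applies.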

mobilenet_ssd does work from the GluonCV library, imported through the from_mxnet function (as in this example: ), but I want to run the graph for a quantized SSD model using the TFLite frontend.

Any idea whether the mobilenet_ssd model from TFLite is supported by TVM yet?


TF(Lite) mobilenet_ssd (even FP32) is still a little way from being supported. TF has special control-flow operators for SSD that require a lot of complicated work in Relay/TVM (the project is called Relay VM). The TVM community is working on that continuously, but expect at least a couple of months before it is realizable. After FP32 is supported, I will start looking into int8.