Moving up to TensorFlow 1.15.0 and TFLite 1.15.0 and other framework upgrades

Hi,

We are starting to require newer versions of TFLite as part of running some of the models that we have. Further, we are noticing that a whole bunch of tests, especially for SPLIT, UNPACK and others, need special handling in the test suite depending on the TFLite version, because TFLite 1.13 is buggy in the way it creates models for Split and Unpack. A lot of our tests are written to run the same test both in the original framework and via TVM; however, we have seen at least two operators (SPLIT and UNPACK) whose models are broken when created with the TFLite converter. The guard sketched below is roughly the kind of special handling I mean.
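A minimal sketch of such a version guard (the marker name, threshold and test below are illustrative, not the actual test-suite code):

```python
# Hypothetical guard: skip converter-dependent tests when the installed
# TF/TFLite is older than 1.15, since the 1.13 converter emits broken
# SPLIT/UNPACK models. Names and threshold are illustrative only.
import pytest
import tensorflow as tf
from packaging import version

requires_tflite_1_15 = pytest.mark.skipif(
    version.parse(tf.__version__) < version.parse("1.15.0"),
    reason="TFLite converter before 1.15 produces broken SPLIT/UNPACK models",
)

@requires_tflite_1_15
def test_forward_split():
    # Build the graph, convert it with tf.lite.TFLiteConverter, then compare
    # the TFLite interpreter's output against the TVM-compiled output.
    ...
```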

Does anyone else in the community have experience with moving our CI system up to 1.15.0 for TensorFlow and TFLite, so that the newer operators are supported and the model-creation bugs are fixed? I have been slowly working through a bunch of failures, but I wanted to check in with the community before spending more time on it.

Further, we need to think about the following wider topics.

  1. TensorFlow 2.0 and TFLite 2.0 - how do we plan to support these? It sounds to me like TF 2.0 is different enough from TF 1.x that we need two separate strands of activity and potentially another strand of CI testing.

  2. While we are here, we should try to settle on a pragmatic general policy for upgrades of the relevant frameworks. For example: is the latest stable version good enough everywhere, given that we need to handle recent enough operators, and do we need to plan for regular upgrades?

Regards, Ramana

@janimesh, @FrozenGene

Any thoughts on this ?

Ramana

@ramana-arm are TF 1.13.1 and 1.14 both properly supported in TVM at the moment?

A plan for moving up to TF 1.14 would be great!

Moving to TF 2.0 and supporting old versions with compat.v1 would be a decent move. I have analysed it and it looks feasible.
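As a rough illustration (a minimal sketch, not tested against the TVM frontend), 1.x-style graph code can keep running on a TF 2.x install through the compat.v1 layer:

```python
# Sketch: run 1.x graph-mode code under TF 2.x via tf.compat.v1, so existing
# graph-based frontend tests could stay largely as they are.
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()  # restore 1.x graph-mode semantics

with tf1.Session() as sess:
    x = tf1.placeholder(tf.float32, shape=(1, 4), name="input")
    y = tf1.reduce_sum(x, axis=1, name="sum")
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0, 4.0]]}))  # [10.]
```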

This would require some changes to the existing frontend and a lot of effort on test cases based on TF 2.0. It would also require additional CI setup.

I could propose a PR for TF 2.0 upgrade.

TF 2.0+ is a big change compared to 1.x and we should be careful with it; in particular, the TFLite quantized model format has changed (the input type changed from uint8 to int8, see https://www.tensorflow.org/lite/performance/quantization_spec). Apart from this, I agree we should upgrade to TF 2.0+ and use the old compat.v1 path to maintain the old TF versions' tests.
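As a rough sketch of the new behaviour (the `keras_model` and the random calibration data below are placeholders, not code from this thread), the TF 2.x full-integer path produces int8 inputs and outputs:

```python
# Sketch of TF 2.x post-training full-integer quantization; inputs/outputs
# become int8, whereas 1.x quantized TFLite models used uint8 here.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Placeholder calibration data; a real flow would feed genuine samples.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)  # keras_model is assumed
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()
```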


Zhao, I agree with you. I think we need a scheme that supports both the TFLite 1.x and the TF 2.x quantization stories; I will need to check, but from my perspective there will be users who want each of them. There is also the change from per-tensor quantization to per-channel quantization to handle, as in the sketch below.
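Roughly like this (the `tflite_model` buffer is assumed to come from a TF 2.x converter run, not from our tests), per-channel tensors show up with one scale per channel:

```python
# Sketch: inspect quantization parameters of a converted model; per-channel
# quantized tensors report an array of scales rather than a single value.
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
for detail in interpreter.get_tensor_details():
    scales = detail["quantization_parameters"]["scales"]
    if len(scales) > 1:  # one scale per output channel
        print(detail["name"], "has", len(scales), "per-channel scales")
```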

Ramana

To move this forward, I spent some time over the past few days getting both TF 1.15 and TF 2.x testing working with our CI and ran into a few issues.

See

Regards, Ramana