[Relay] Stability?


#1

How stable is each frontend in Relay?


#2

By stability I’m assuming you mean maturity?

We should have harder data for this in the next few weeks as we are looking to measure model importer coverage across a large number of models and framework formats in a continuously updated public dashboard. Please stay tuned!

CC @thierry.


#3

Just polling interest, but would it help to have an up to date webpage that indicates importer success into Relay for a collection of workloads expressed in different frameworks?

The page would also keep track of which versions of TF, ONNX, PyTorch, etc. are working with the Relay importers.


#4

+1

I think that’s a great idea. Along with supported ops and versions, we could also have some “best practices”. For example, changes to the graph that make a model run better on TVM (like substituting a static RNN for a dynamic one, or removing dropout and assert ops). These docs will have to evolve over time as ops are added and best practices change.
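To make the graph-surgery idea concrete, here is a minimal sketch of pruning inference-irrelevant ops (like dropout and assert) before handing a model to an importer. The dict-based graph format and the `prune_ops` helper are purely illustrative assumptions, not a real TVM or TensorFlow data structure:

```python
# Hedged sketch: strip pass-through ops (Dropout, Assert) from a toy
# graph representation. Each node maps to (op_type, list_of_inputs).
# This is an illustration of the idea only, not an actual frontend API.

def prune_ops(graph, removable=("Dropout", "Assert")):
    """Remove listed ops, rewiring consumers to each removed op's first input."""
    # Map each removed node to the input that replaces it.
    alias = {}
    for name, (op, inputs) in graph.items():
        if op in removable:
            alias[name] = inputs[0] if inputs else None

    def resolve(name):
        # Follow chains of removed nodes until we reach a surviving producer.
        while name in alias:
            name = alias[name]
        return name

    return {
        name: (op, [resolve(i) for i in inputs])
        for name, (op, inputs) in graph.items()
        if op not in removable
    }

g = {
    "x":    ("Placeholder", []),
    "fc":   ("MatMul", ["x"]),
    "drop": ("Dropout", ["fc"]),
    "out":  ("Softmax", ["drop"]),
}
print(prune_ops(g))
# {'x': ('Placeholder', []), 'fc': ('MatMul', ['x']), 'out': ('Softmax', ['fc'])}
```

Real frontends do this kind of rewrite on the framework's own graph format (e.g. a frozen TensorFlow `GraphDef`) before import, but the shape of the transformation is the same: drop the node, reconnect its consumers to its producer.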

If the community thinks this would be useful, I’d be willing to take a stab at some TensorFlow docs, since that’s what I’m most familiar with.


#5

@jonso that would be very useful!


#6

I just encountered a TensorFlow model whose ops all seem to be supported by TVM, but the TVM TensorFlow frontend still fails to import it. If you are interested, let me know; I’d like to share the frozen .pb file to help improve the TVM TensorFlow frontend.


#7

Perfect, thanks! What license could you share the model under? We’re pretty flexible, but the more permissive the license, the quicker we can get to it.


#8

I just sent out a PR to update the TensorFlow frontend docs here. Let me know what you think!