Does Relay tensorflow frontend support dynamic input shapes?
Can you clarify what you mean by dynamic input shapes? People often use the term to refer to a few different scenarios. Relay supports programs whose tensors vary in one or more dimensions while keeping a fixed rank, though TVM does not yet generate optimal code for these cases. We also plan to support "jagged tensor" style tensors, but that support has not been implemented upstream yet.
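To make the "fixed rank, varying dimensions" distinction concrete, here is a pure-Python sketch (no TVM APIs, just the shape semantics being described): two shapes are compatible when the rank matches and every dimension either agrees or is symbolic, represented here as None.

```python
def shapes_compatible(sig, concrete):
    """Check a concrete runtime shape against a fixed-rank signature.

    `sig` is a tuple where None marks a dimension that may vary at
    runtime; `concrete` is a fully known shape. Rank must match --
    that is the flavor of dynamism Relay aims to support. Jagged
    tensors (where rank or per-row lengths vary) fall outside this
    model, which is why they need separate support.
    """
    if len(sig) != len(concrete):
        return False  # rank mismatch: not expressible as fixed-rank dynamism
    return all(s is None or s == c for s, c in zip(sig, concrete))

# A batch dimension that varies, with fixed rank 4 (NHWC):
sig = (None, 224, 224, 3)
print(shapes_compatible(sig, (1, 224, 224, 3)))   # True
print(shapes_compatible(sig, (8, 224, 224, 3)))   # True
print(shapes_compatible(sig, (224, 224, 3)))      # False: rank differs
```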
Can you elaborate on your use case?
When we use NNVM to compile a TensorFlow graph, the input shapes must be fixed values so that NNVM can infer shapes before building the runtime. I see the Relay frontend is still being merged, and I would like to ask whether I can compile a graph with dynamic input shapes and still generate an optimized runtime.
The Relay frontend currently supports some level of dynamism, as you can see in examples such as https://github.com/dmlc/tvm/blob/master/tests/python/relay/test_op_level5.py#L11. But it is not supported by the runtime yet, and memory planning doesn't support dynamism so far either.
Currently, TVM doesn't support dynamic input shapes when compiling a TensorFlow graph. A TensorFlow GraphDef usually uses -1 to indicate a dynamic dimension, and the TF frontend in TVM hasn't supported that yet. As zhiiics said above, the runtime doesn't support dynamism yet either.
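As a practical workaround, users supply a concrete shape for every input so that no -1 dimension survives into shape inference. A minimal pure-Python sketch of that resolution step (the function name and dict format are illustrative, not TVM's actual frontend API):

```python
def resolve_shapes(graph_shapes, user_shapes):
    """Replace -1 (dynamic) dims in a GraphDef-style shape map with
    user-supplied concrete values, as a frontend must do before
    static shape inference can run.

    graph_shapes: {input_name: dims tuple, -1 marks unknown dims}
    user_shapes:  {input_name: fully concrete dims tuple}
    """
    resolved = {}
    for name, dims in graph_shapes.items():
        if -1 in dims:
            if name not in user_shapes:
                raise ValueError(f"input '{name}' has dynamic dims {dims}; "
                                 "a concrete shape must be provided")
            fixed = user_shapes[name]
            if len(fixed) != len(dims):
                raise ValueError(f"rank mismatch for input '{name}'")
            # Keep known dims, fill each -1 from the user-provided shape.
            dims = tuple(u if d == -1 else d for d, u in zip(dims, fixed))
        resolved[name] = tuple(dims)
    return resolved

# A GraphDef with an unknown batch dimension:
graph_shapes = {"input": (-1, 224, 224, 3)}
print(resolve_shapes(graph_shapes, {"input": (1, 224, 224, 3)}))
# {'input': (1, 224, 224, 3)}
```

This is also why the shape argument passed to the TensorFlow frontend must be fully concrete today: once any -1 remains, downstream shape inference and memory planning have nothing fixed to work with.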
Thank you so much. By the way, it seems like you are planning to support this soon, right?