Dynamic concat and reshape_like cause TVMError: Not all Vars are passed in api_args: 'any_dim' is not bound to any variables

When I try to use relay.vm.compile to compile a network that contains operators with dynamic shapes, TVM fails and displays the error message

TVMError: Not all Vars are passed in api_args: 'any_dim'  is not bound to any variables

VMCompiler.lower works fine; the error is raised in compiler.codegen().

The entire failure log:


Traceback (most recent call last):
  File "post_process_lab.py", line 250, in <module>
    compile_with_cpu()
  File "post_process_lab.py", line 187, in compile_with_cpu
    exe = relay.vm.compile(mod, target=target, params=params)
  File "/root/Codes/tvm_in_mac/python/tvm/relay/backend/vm.py", line 72, in compile
    compiler.codegen()
  File "/root/Codes/tvm_in_mac/python/tvm/relay/backend/vm.py", line 141, in codegen
    self._codegen()
  File "/root/Codes/tvm_in_mac/python/tvm/_ffi/_ctypes/packed_func.py", line 225, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /root/Codes/tvm_in_mac/build/libtvm.so(tvm::build(tvm::Map<tvm::runtime::String, tvm::IRModule, void, void> const&, tvm::Target const&)+0x63c) [0x7f7a443e7c7c]
  [bt] (7) /root/Codes/tvm_in_mac/build/libtvm.so(tvm::build(tvm::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target const&)+0x252) [0x7f7a443e6e92]
  [bt] (6) /root/Codes/tvm_in_mac/build/libtvm.so(tvm::SplitDevHostFuncs(tvm::IRModule, tvm::Target const&, tvm::Target const&, tvm::transform::PassContext const&)+0x3ba) [0x7f7a443e5cfa]
  [bt] (5) /root/Codes/tvm_in_mac/build/libtvm.so(tvm::transform::Pass::operator()(tvm::IRModule) const+0x66) [0x7f7a443ecfe6]
  [bt] (4) /root/Codes/tvm_in_mac/build/libtvm.so(tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x3e6) [0x7f7a44436166]
  [bt] (3) /root/Codes/tvm_in_mac/build/libtvm.so(tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x14e) [0x7f7a44434afe]
  [bt] (2) /root/Codes/tvm_in_mac/build/libtvm.so(+0xfc075c) [0x7f7a446c075c]
  [bt] (1) /root/Codes/tvm_in_mac/build/libtvm.so(tvm::tir::MakePackedAPI(tvm::tir::PrimFunc&&, int)+0x28ef) [0x7f7a446bdaef]
  [bt] (0) /root/Codes/tvm_in_mac/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x67) [0x7f7a44301cd7]
  File "/root/Codes/tvm_in_mac/src/tir/transforms/make_packed_api.cc", line 208
TVMError: Not all Vars are passed in api_args:  'any_dim'  'any_dim'  'any_dim'  is not bound to any variables

This bug happens when a concat operator's inputs are produced by dynamic reshape_like operators.

It can be reproduced with the following script:

import tvm
import tensorflow as tf  # TF 1.x graph-mode APIs (tf.placeholder, tf.Graph)
from tvm import relay

def main():
    # create graph
    debug_graph = tf.Graph()
    with debug_graph.as_default():
        input_1 = tf.placeholder(dtype=tf.float32, shape=[None,], name='input_1')
        input_1_shape = tf.shape(input_1, name='shape_1')
        reshape_1 = tf.reshape(input_1, input_1_shape, name='reshape_1')

        input_2 = tf.placeholder(dtype=tf.float32, shape=[None,], name='input_2')
        input_2_shape = tf.shape(input_2, name='shape_2')
        reshape_2 = tf.reshape(input_2, input_2_shape, name='reshape_2')
        
        result = tf.concat([reshape_1, reshape_2], 0, name='result')

    # create ir module
    layout = "NHWC"
    mod, params = relay.frontend.from_tensorflow(
        debug_graph.as_graph_def(),
        layout=layout,
        outputs=['result']
    )

    # compile with the VM
    target = "llvm"
    exe = relay.vm.compile(mod, target=target, params=params)


if __name__ == '__main__':
    main()



I also tried a basic te.compute() example with a dynamic IterVar and hit the same error, so I suspect te.compute does not support dynamic-shape iterators. Please let me know if you have found a solution, thanks.