When inferring Reshape argument by precompute, HalideIR assert fails

Hi.

When using relay.frontend.onnx to read my own .onnx file, there is a Reshape op in my ONNX model whose input[1] comes from the previous op's output rather than from a constant. So Relay needs to build the previous function to precompute input[1]'s value.
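To illustrate the pattern (with hypothetical shape values), the ONNX spec stores Reshape's `shape` input as an int64 tensor, so when Relay precomputes it from a previous op's output, each extracted dimension is an int64 scalar; this matches the `take(%p0, %p1, axis=0) // ty=int64` in the lowered function below:

```python
import numpy as np

# Hypothetical shape values: ONNX Reshape's "shape" input is int64 per
# the ONNX spec, so the precomputed scalars come out as int64 too.
shape_tensor = np.array([1, 3, 300, 300], dtype=np.int64)
dim0 = np.take(shape_tensor, 0, axis=0)  # analogous to the Relay take()
print(dim0, dim0.dtype)
```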

However, when execution reaches:

        try:
            key = _get_cache_key(source_func, target)
            return _backend._CompileEngineLower(self, key)
        except Exception:
            import traceback
            msg = traceback.format_exc()
            msg += "Error during compile func\n"
            msg += "--------------------------\n"
            msg += source_func.astext(show_meta_data=False)
            msg += "--------------------------\n"
            raise RuntimeError(msg)

It raises an exception. I traced the key value, which is:

v0.0.1
%1 = fn (%p0: Tensor[(4,), int64], %p1: int64, __dict__=meta[StrMap][0]) -> int64 {
  %0 = take(%p0, %p1, axis=0) // ty=int64
  %0
}
%1
// meta data omitted. you can use show_meta_data=True to include meta data

And the c_err_msg from _LIB.TVMGetLastError() is:

[10:39:14] /home/prototype/Downloads/tvm/3rdparty/HalideIR/src/ir/IR.cpp:468: Check failed: args[i].type() == Int(32) Args to call to halide function must be type Int(32)


Stack trace returned 10 entries:
[bt] (0) /home/prototype/Downloads/tvm/build/libtvm.so(+0x77ede0) [0x7f4124897de0]
[bt] (1) /home/prototype/Downloads/tvm/build/libtvm.so(+0x77e9ad) [0x7f41248979ad]
[bt] (2) /home/prototype/Downloads/tvm/build/libtvm.so(HalideIR::Internal::Call::make(HalideIR::Type, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, tvm::Array<HalideIR::Expr, void>, HalideIR::Internal::Call::CallType, HalideIR::IR::FunctionRef, int)+0x2e4) [0x7f41250128b4]
[bt] (3) /home/prototype/Downloads/tvm/build/libtvm.so(tvm::Tensor::operator()(tvm::Array<HalideIR::Expr, void>) const+0x2d3) [0x7f4124a1f3c3]
[bt] (4) /home/prototype/Downloads/tvm/build/libtvm.so(+0xc87d3e) [0x7f4124da0d3e]
[bt] (5) /home/prototype/Downloads/tvm/build/libtvm.so(+0xc874dc) [0x7f4124da04dc]
[bt] (6) /home/prototype/Downloads/tvm/build/libtvm.so(tvm::compute(tvm::Array<HalideIR::Expr, void>, std::function<HalideIR::Expr (tvm::Array<tvm::Var, void> const&)>, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, tvm::Map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, tvm::NodeRef, void, void>)+0x767) [0x7f4124a22e17]
[bt] (7) /home/prototype/Downloads/tvm/build/libtvm.so(+0xc759c3) [0x7f4124d8e9c3]
[bt] (8) /home/prototype/Downloads/tvm/build/libtvm.so(+0xc52ab1) [0x7f4124d6bab1]
[bt] (9) /home/prototype/Downloads/tvm/build/libtvm.so(+0xb73555) [0x7f4124c8c555]

I’m not sure whether the reason is that the args’ type passed into the Halide IR is int64 rather than int32, or something else?
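If the int64 hypothesis is right, casting the precomputed shape scalars down to int32 before they reach the Halide `Call` node would satisfy the `Check failed: args[i].type() == Int(32)` assertion. A minimal numpy sketch of that idea (not the actual TVM fix, which lives in the compiler itself):

```python
import numpy as np

# Sketch of the suspected mismatch: Halide's Call node requires Int(32)
# args, but the precomputed shape values arrive as int64. Casting them
# down is safe here, since realistic tensor dims fit in 32 bits.
dims_i64 = np.array([1, 3, 300, 300], dtype=np.int64)
dims_i32 = dims_i64.astype(np.int32)
print(dims_i32.dtype)
```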

The complete log is:

WARNING:root:Infering Reshape argument by precompute
Traceback (most recent call last):
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/compile_engine.py", line 92, in lower
    return _backend._CompileEngineLower(self, key)
  File "/home/prototype/Downloads/tvm/python/tvm/_ffi/_ctypes/function.py", line 206, in __call__
    raise get_last_ffi_error()
  File "/home/prototype/Downloads/tvm/python/tvm/_ffi/base.py", line 297, in get_last_ffi_error
    if err_type.startswith("tvm.error."):
AttributeError: 'NoneType' object has no attribute 'startswith'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/prototype/Desktop/Deep-Learning/Pytorch-Learn/tvm/tvm_ssd.py", line 36, in <module>
    sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/frontend/onnx.py", line 1143, in from_onnx
    sym, params = g.from_onnx(graph, opset)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/frontend/onnx.py", line 983, in from_onnx
    op = self._convert_operator(op_name, inputs, attr, opset)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/frontend/onnx.py", line 1089, in _convert_operator
    sym = convert_map[op_name](inputs, attrs, self._params)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/frontend/onnx.py", line 358, in _impl_v1
    graph, lib, params = tvm.relay.build(func, target="llvm", params=params)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/build_module.py", line 303, in build
    graph_json, lowered_funcs, params = graph_gen.codegen(func)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/graph_runtime_codegen.py", line 443, in codegen
    self.heads = self.visit(func.body)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/expr_functor.py", line 46, in visit
    res = self.visit_call(expr)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/graph_runtime_codegen.py", line 286, in visit_call
    res = self.visit(arg)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/expr_functor.py", line 56, in visit
    res = self.visit_tuple(expr)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/graph_runtime_codegen.py", line 194, in visit_tuple
    ref = self.visit(field)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/expr_functor.py", line 46, in visit
    res = self.visit_call(expr)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/graph_runtime_codegen.py", line 286, in visit_call
    res = self.visit(arg)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/expr_functor.py", line 46, in visit
    res = self.visit_call(expr)
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/graph_runtime_codegen.py", line 276, in visit_call
    self.target[call_dev_type])
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/compile_engine.py", line 100, in lower
    raise RuntimeError(msg)
RuntimeError: Traceback (most recent call last):
  File "/home/prototype/Downloads/tvm/python/tvm/relay/backend/compile_engine.py", line 92, in lower
    return _backend._CompileEngineLower(self, key)
  File "/home/prototype/Downloads/tvm/python/tvm/_ffi/_ctypes/function.py", line 206, in __call__
    raise get_last_ffi_error()
  File "/home/prototype/Downloads/tvm/python/tvm/_ffi/base.py", line 297, in get_last_ffi_error
    if err_type.startswith("tvm.error."):
AttributeError: 'NoneType' object has no attribute 'startswith'
Error during compile func
--------------------------
v0.0.1
%1 = fn (%p0: Tensor[(4,), int64], %p1: int64, __dict__=meta[StrMap][0]) -> int64 {
  %0 = take(%p0, %p1, axis=0) // ty=int64
  %0
}
%1

Thanks for any help~


Hi, I also encountered this problem and am looking forward to a solution.

This PR should solve it: https://github.com/dmlc/tvm/pull/3230