When I use relay.frontend.from_tensorflow, the following error occurs:
File "tvm/python/tvm/relay/frontend/tensorflow.py", in from_tensorflow
mod, params = g.from_tensorflow(graph, layout, shape, outputs)
File "tvm/python/tvm/relay/frontend/tensorflow.py", in from_tensorflow
for node_item in self._nodes[node.name]]
File "tvm/python/tvm/relay/frontend/common.py", in infer_shape
return get_const_tuple(out_type.checked_type.shape)
File "tvm/topi/python/topi/util.py", in get_const_tuple
elem = tvm.tir.ir_pass.Simplify(elem)
xxxxx(etc.)
TVMError: Check failed: can_dispatch(n): NodeFunctor calls un-registered function on type Any
TVM: latest version
TensorFlow: 1.14
Model: BERT
I printed some additional information. The op is Where (len(inputs) == 1, so it is converted to argwhere):
[
CallNode(
Op(argwhere),
[CallNode(
Op(not_equal),
[CallNode(
Op(full),
[Constant(1),],
relay.attrs.InitOpAttrs(0x98720b8),
[]),
CallNode(
Op(zeros_like),
[CallNode(Op(full),[Constant(1),],relay.attrs.InitOpAttrs(0x98720b8),[])]
relay.attrs.InitOpAttrs(0x98720b8),
[]),
],
(nullptr),[]
)],
(nullptr),
[]
)
]
input mod info:
%0 = full(1, shape=[128], dtype="int64");
free_var %zeros_like: Tensor[(128), int64]
not_equal(%0, %zeros_like)
input shape:
[128]
I added a print in the infer_shape function in tvm/python/tvm/relay/frontend/common.py. At the point of the error:
out_type.checked_type = Tensor[(?, 1), int32]
out_type.checked_type.shape: [?, 1]
The output shape of the argwhere op should be dynamic, since it depends on the actual computed values.
So, is there a problem with Relay's implementation of 'where'?
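To illustrate why the shape is data-dependent, NumPy's argwhere (which the Relay op mirrors) returns a different number of rows depending on the values, so the first dimension cannot be known at compile time:

```python
import numpy as np

# The number of rows in argwhere's result depends on how many elements
# satisfy the condition, not on the input shape alone.
a = np.array([1, 0, 2, 0], dtype=np.int64)
b = np.array([5, 0, 0, 0], dtype=np.int64)

print(np.argwhere(a != 0).shape)  # (2, 1): two non-zero elements
print(np.argwhere(b != 0).shape)  # (1, 1): one non-zero element
```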