Model can be imported by NNVM but fails on Relay

When running the line below:

sym, params = relay.frontend.from_tensorflow(graph_def)

I get an “internal invariant was violated” error.

Error message:
fn () {
free_var %BoxPredictor_0/Shape: Tensor[(4,), int32]
%0 = strided_slice(%BoxPredictor_0/Shape, begin=[0], end=[1], strides=[1]) #
%1 = reshape(%0, newshape=[]) #
%2 = expand_dims(%1, axis=0) #
free_var %BoxPredictor_0/stack_1/1: Tensor[(1,), int32]
%3 = expand_dims(%BoxPredictor_0/stack_1/1, axis=0) #
free_var %BoxPredictor_0/stack_1/2: Tensor[(1,), int32]
%4 = expand_dims(%BoxPredictor_0/stack_1/2, axis=0) #
%5 = (%2, %3, %4)
%6 = concatenate(%5) # an internal invariant was violdated while typechecking your program [12:20:32] /Users/jialei/github/tvm/src/relay/op/tensor/transform.cc:188: Check failed: e_ndim == ndim (2 vs. 1) relay.concatenate requires all tensors have the same ndim
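The mismatch can be reproduced outside TVM. The sliced shape value is reshaped to a 0-d scalar before `expand_dims`, while the `stack` constants are already 1-D, so after `expand_dims` the inputs to `concatenate` end up with different ndims. A minimal NumPy sketch of the same shape arithmetic (the values are illustrative, not taken from the model):

```python
import numpy as np

# strided_slice followed by reshape(newshape=[]) turns the
# extracted length into a 0-d scalar (illustrative value).
length = np.reshape(np.array([300], dtype=np.int32)[0:1], ())

# expand_dims(axis=0) on the 0-d scalar yields a 1-D tensor of shape (1,).
a = np.expand_dims(length, axis=0)          # ndim == 1

# The stack constants are already 1-D with shape (1,), so the same
# expand_dims makes them 2-D with shape (1, 1).
b = np.expand_dims(np.array([1], dtype=np.int32), axis=0)  # ndim == 2

# concatenate then sees mixed ndims (1 vs 2), which matches the
# "requires all tensors have the same ndim" check failure above.
# np.concatenate([a, b])  # raises ValueError in NumPy as well
```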

The same model compiles without problems in NNVM if I do:
sym, params = nnvm.frontend.from_tensorflow(graph_def)

I would appreciate it if anyone could point me to where I should look to investigate further.


Same situation here:
tvm._ffi.base.TVMError: [01:47:41] /homed/lidw/git/DaweiLi/tvm/src/relay/op/tensor/transform.cc:187: Check failed: e_ndim == ndim (2 vs. 1) relay.concatenate requires all tensors have the same ndim

Can you check whether the same model works with NNVM?

The 0-d-aware expand_dims patch below might help for Relay:
https://github.com/dmlc/tvm/commit/00d509d480ae55def1abccc61742a163dadfab39#diff-3845514369cc735d96dd63a274a21471

There are some patches that need to be ported from the NNVM frontend to the Relay frontend. I am working on it.

Thanks,
Siva

The same model works for NNVM, and we are currently using NNVM.

Looking forward to seeing Relay work as well!

Thanks!


@sol401430

Can you check whether the change below fixes your problem?