Hi,

I'm trying to run a simple network written in the Relay interface (`wrapper` is `relay.testing.layers`):
```python
batch_size = 128
image_shape = (3, 224, 224)
num_classes = 1008
dtype = "float32"  # dtype was not defined in my snippet above; this is what I use

def get_net():
    data_shape = (batch_size,) + image_shape
    data = relay.var("data", shape=data_shape, dtype=dtype)
    feature = wrapper.conv2d(data=data, channels=64, kernel_size=(11, 11),
                             strides=(4, 4), padding=(0, 0), name="conv1")
    feature = relay.nn.bias_add(feature, relay.var("conv1_bias"))
    feature = relay.nn.relu(data=feature)
    feature = relay.nn.max_pool2d(data=feature, pool_size=(5, 5),
                                  strides=(1, 1), padding=(2, 2))
    flatten = relay.nn.batch_flatten(data=feature)
    fc8 = wrapper.dense_add_bias(data=flatten, units=num_classes, name="fc8")
    args = relay.analysis.free_vars(fc8)
    return relay.Function(args, fc8)
```
I'm getting the following error. I suspect it is due to a dimension mismatch, and I could verify that the error is raised during codegen for the "fc8" `dense_add_bias` layer, but I'm not certain why it happens. Could someone familiar with the LLVM codegen help?
```
Traceback (most recent call last):
  File "run_net.py", line 62, in <module>
    graph, lib, params = relay.build_module.build(mod, target=target, params=params)
  File "/homes/tharindu/tvm-master/python/tvm/relay/build_module.py", line 207, in build
    graph_json, mod, params = bld_mod.build(func, target, target_host, params)
  File "/homes/tharindu/tvm-master/python/tvm/relay/build_module.py", line 108, in build
    self._build(func, target, target_host)
  File "tvm/_ffi/_cython/./function.pxi", line 310, in core.FunctionBase.__call__
  File "tvm/_ffi/_cython/./function.pxi", line 245, in core.FuncCall
  File "tvm/_ffi/_cython/./function.pxi", line 234, in core.FuncCall3
  File "tvm/_ffi/_cython/./base.pxi", line 171, in core.CALL
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /homes/tharindu/tvm-master/build/libtvm.so(tvm::IRFunctor<void (tvm::NodeRef const&, tvm::ir::StmtFunctor<void (tvm::Stmt const&)>*)>::operator()(tvm::NodeRef const&, tvm::ir::StmtFunctor<void (tvm::Stmt const&)>*) const+0x68) [0x2aee143b7098]
  [bt] (7) /homes/tharindu/tvm-master/build/libtvm.so(tvm::codegen::CodeGenCPU::VisitStmt_(tvm::ir::For const*)+0x76b) [0x2aee1490fc3b]
  [bt] (6) /homes/tharindu/tvm-master/build/libtvm.so(tvm::codegen::CodeGenLLVM::VisitStmt_(tvm::ir::For const*)+0x10b) [0x2aee14921d0b]
  [bt] (5) /homes/tharindu/tvm-master/build/libtvm.so(tvm::codegen::CodeGenLLVM::CreateSerialFor(llvm::Value*, llvm::Value*, llvm::Value*, tvm::Var const&, tvm::Stmt const&)+0x487) [0x2aee14921747]
  [bt] (4) /homes/tharindu/tvm-master/build/libtvm.so(tvm::IRFunctor<void (tvm::NodeRef const&, tvm::ir::StmtFunctor<void (tvm::Stmt const&)>*)>::operator()(tvm::NodeRef const&, tvm::ir::StmtFunctor<void (tvm::Stmt const&)>*) const+0x68) [0x2aee143b7098]
  [bt] (3) /homes/tharindu/tvm-master/build/libtvm.so(tvm::codegen::CodeGenLLVM::VisitStmt_(tvm::ir::Store const*)+0xc0) [0x2aee1491bf60]
  [bt] (2) /homes/tharindu/tvm-master/build/libtvm.so(tvm::IRFunctor<llvm::Value* (tvm::NodeRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*)>::operator()(tvm::NodeRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*) const+0x68) [0x2aee14900f88]
  [bt] (1) /homes/tharindu/tvm-master/build/libtvm.so(tvm::codegen::CodeGenLLVM::VisitExpr_(tvm::ir::Load const*)+0x1c0) [0x2aee1491b8f0]
  [bt] (0) /homes/tharindu/tvm-master/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x22) [0x2aee1429ee22]
  File "/homes/tharindu/tvm-master/src/codegen/llvm/codegen_llvm.cc", line 961
TVMError: Check failed: ramp->lanes == t.lanes() (186624 vs. 55552) :
```
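For what it's worth, if I redo the shape arithmetic by hand (a quick sanity check in plain Python, using the usual output-size formula for conv/pool), the number 186624 in the check is exactly the flattened feature length going into fc8:

```python
# Sanity check: recompute the expected feature-map sizes with the
# standard output-size formula: out = (in + 2*pad - kernel) // stride + 1
def out_size(size, kernel, stride, pad):
    return (size + 2 * pad - kernel) // stride + 1

conv1 = out_size(224, kernel=11, stride=4, pad=0)   # conv1: 224 -> 54
pool1 = out_size(conv1, kernel=5, stride=1, pad=2)  # max_pool2d: 54 -> 54
flat_len = 64 * pool1 * pool1                       # channels * H * W

print(conv1, pool1, flat_len)  # 54 54 186624
```

So the 186624 lanes on the left-hand side of the check seem to correspond to the batch-flattened input of the fc8 dense layer; I don't see where the 55552 on the right comes from.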
Thanks,