[Gradient] Building module out of backward function (gradient pass) fails

I tried to build a module out of the backward function by adding the snippet below at

`incubator-tvm/tests/python/relay/test_op_grad_level2.py`, line 145 (commit 3e3ccce):

```python
np.testing.assert_allclose(grad_weight.asnumpy(), grad_weight_pt, rtol=1e-4, atol=1e-4)
```

```python
bwd_mod = tvm.ir.IRModule({'main': bwd_func})
with relay.build_config(opt_level=0):
    graph, lib, params = relay.build(bwd_func, "llvm", None, {})
```

Here the module build fails during memory planning. I tried to root-cause it a bit; here is what I found.

The generated IR looks like this:

```
fn (%data: Tensor[(1, 4, 16, 16), float32], %weight: Tensor[(16, 4, 3, 3), float32])
    -> (Tensor[(1, 16, 16, 16), float32], (Tensor[(1, 4, 16, 16), float32], Tensor[(16, 4, 3, 3), float32])) {
  %0 = fn () -> () { () };
  let %x: ref(fn () -> ()) = ref(%0);
  %1 = zeros_like(%data) /* ty=Tensor[(1, 4, 16, 16), float32] */;
  %2 = ref(%1);
  ...
```

The empty function above generates IR as:

```
RefCreateNode(FunctionNode([], TupleTypeNode([]), Tuple([]), [], (nullptr)))
```

Here the Tuple is empty (so no storage is allocated for it), and the memory-planning pass fails with the error below.

```
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::GraphPlanMemory(tvm::relay::Function const&)+0x174) [0x7fef7354bae4]
  [bt] (7) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::StorageAllocator::Plan(tvm::relay::Function const&)+0x2a3) [0x7fef73551fb3]
  [bt] (6) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::StorageAllocaBaseVisitor::Run(tvm::relay::Function const&)+0x39d) [0x7fef7354dedd]
  [bt] (5) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::StorageAllocaBaseVisitor::GetToken(tvm::RelayExpr const&)+0xf3) [0x7fef7354d5d3]
  [bt] (4) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::ExprVisitor::VisitExpr(tvm::RelayExpr const&)+0x83) [0x7fef735db533]
  [bt] (3) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x445) [0x7fef733e62c5]
  [bt] (2) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::StorageAllocaBaseVisitor::VisitExpr_(tvm::relay::LetNode const*)+0x29) [0x7fef7354fdf9]
  [bt] (1) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(tvm::relay::StorageAllocaBaseVisitor::GetToken(tvm::RelayExpr const&)+0x17f) [0x7fef7354d65f]
  [bt] (0) /local/mnt/workspace/sivb/work/Apache/incubator-tvm/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x32) [0x7fef72da7352]
  File "/local/mnt/workspace/sivb/work/Apache/incubator-tvm/src/relay/backend/graph_plan_memory.cc", line 140
TVMError: Check failed: it != token_map_.end():
```
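To illustrate the failure mode, here is a toy model in plain Python (no TVM; the class and names are hypothetical, only the pattern matches `graph_plan_memory.cc`): the visitor records a storage token only for expressions that produce data, so a node whose result needs no storage, like the empty function/tuple, never enters the token map, and a later lookup on it hits the `Check failed: it != token_map_.end()` assertion.

```python
# Toy sketch of the memory planner's token map (illustrative only, not TVM code).
# Data-producing expressions get a storage token; an empty tuple/function result
# allocates nothing, so it is never registered, and a later lookup fails.

class StorageAllocaToy:
    def __init__(self):
        self.token_map = {}  # expression name -> storage token

    def visit(self, expr):
        # Only expressions that produce data receive a token.
        if expr["produces_data"]:
            self.token_map[expr["name"]] = object()

    def get_token(self, expr):
        # Mirrors: Check failed: it != token_map_.end()
        if expr["name"] not in self.token_map:
            raise RuntimeError("Check failed: it != token_map_.end()")
        return self.token_map[expr["name"]]

if __name__ == "__main__":
    alloc = StorageAllocaToy()
    tensor = {"name": "zeros_like(%data)", "produces_data": True}
    empty_fn = {"name": "fn () -> () { () }", "produces_data": False}

    alloc.visit(tensor)
    alloc.visit(empty_fn)

    alloc.get_token(tensor)       # fine: a token was recorded for the tensor
    try:
        alloc.get_token(empty_fn)  # fails, like the memory planner does here
    except RuntimeError as err:
        print(err)
```

This suggests the planner needs a way to either skip or tolerate zero-storage nodes introduced by the gradient pass.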

Thanks in advance

This doesn't reproduce on the latest 0.7 dev version.