tvm.compute with variable shape fails to build

I'm writing a gradient (backward) DSL for topi/vision/rcnn/roi_align.py. In the _sample method, roi_bin_grid_h and roi_bin_grid_w are computed at runtime when the adaptive sampling size is used.
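For context, the adaptive branch of _sample computes the grid extents roughly like this (paraphrased, not a verbatim quote of the topi source; pooled_size_h and pooled_size_w follow the names used there):

# Paraphrased sketch of the adaptive branch in _sample (roi_align.py).
# The extents are expressions over the ROI values, not constants or vars.
roi_bin_grid_h = tvm.ceil(roi_h / pooled_size_h).astype("int32")
roi_bin_grid_w = tvm.ceil(roi_w / pooled_size_w).astype("int32")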

Here is a snippet of the gradient DSL that fails in tvm.build:

import tvm

def test(rois, feature_stride=1, feature_size=2):
    spatial_scale = 1.0 / float(feature_stride)
    roi = rois[0]
    b, w_s, h_s, w_e, h_e = roi[0].astype("int32"), roi[1], roi[2], roi[3], roi[4]
    w_s *= spatial_scale
    h_s *= spatial_scale
    w_e *= spatial_scale
    h_e *= spatial_scale

    roi_h = tvm.max(h_e - h_s, tvm.const(1.0, "float32"))
    roi_w = tvm.max(w_e - w_s, tvm.const(1.0, "float32"))

    bin_h = roi_h / feature_size
    bin_w = roi_w / feature_size

    # The grid extents are expressions over values read from the rois placeholder.
    roi_bin_grid_h = tvm.ceil(roi_h / feature_size).astype("int32")
    roi_bin_grid_w = tvm.ceil(roi_w / feature_size).astype("int32")

    # These compute shapes are data-dependent expressions, not tvm.var.
    h_ = tvm.compute((roi_bin_grid_h,), lambda i: (0.5 + i) * bin_h / roi_bin_grid_h)
    w_ = tvm.compute((roi_bin_grid_w,), lambda i: (0.5 + i) * bin_w / roi_bin_grid_w)
    hw_ = tvm.compute((roi_bin_grid_h, roi_bin_grid_w, 2),
                      lambda i, j, k: tvm.if_then_else(k == 0, h_(i), w_(j)))

    return hw_

roi_pld = tvm.placeholder((1, 5), dtype="float32")
output_pld = test(roi_pld)

ctx = tvm.context("llvm", 0)
sch = tvm.create_schedule([roi_pld.op, output_pld.op])

func = tvm.build(sch, [roi_pld, output_pld], "llvm")

This is the error with the latest TVM master:

Traceback (most recent call last):
  File "mytest/test_bug.py", line 33, in <module>
    func = tvm.build(sch, [roi_pld, output_pld], "llvm")
  File "/home/k00375917/tvm/python/tvm/build_module.py", line 642, in build
    mhost = codegen.build_module(fhost_all, str(target_host))
  File "/home/k00375917/tvm/python/tvm/codegen.py", line 36, in build_module
    return _Build(lowered_func, target)
  File "/home/k00375917/tvm/python/tvm/_ffi/_ctypes/function.py", line 207, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /home/k00375917/tvm/build/libtvm.so(tvm::NodeFunctor<llvm::Value* (tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*) const+0x57) [0x7f44dc38e837]
  [bt] (7) /home/k00375917/tvm/build/libtvm.so(tvm::codegen::CodeGenLLVM::VisitExpr_(tvm::ir::Mul const*)+0x24) [0x7f44dc3a83f4]
  [bt] (6) /home/k00375917/tvm/build/libtvm.so(tvm::NodeFunctor<llvm::Value* (tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*) const+0x57) [0x7f44dc38e837]
  [bt] (5) /home/k00375917/tvm/build/libtvm.so(tvm::codegen::CodeGenLLVM::VisitExpr_(tvm::ir::Max const*)+0x2a) [0x7f44dc3a9b3a]
  [bt] (4) /home/k00375917/tvm/build/libtvm.so(tvm::NodeFunctor<llvm::Value* (tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*) const+0x57) [0x7f44dc38e837]
  [bt] (3) /home/k00375917/tvm/build/libtvm.so(tvm::codegen::CodeGenLLVM::VisitExpr_(tvm::ir::Sub const*)+0x14) [0x7f44dc3a8174]
  [bt] (2) /home/k00375917/tvm/build/libtvm.so(tvm::NodeFunctor<llvm::Value* (tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::ir::ExprFunctor<llvm::Value* (tvm::Expr const&)>*) const+0x57) [0x7f44dc38e837]
  [bt] (1) /home/k00375917/tvm/build/libtvm.so(tvm::codegen::CodeGenLLVM::VisitExpr_(tvm::ir::Call const*)+0xe0) [0x7f44dc3a50a0]
  [bt] (0) /home/k00375917/tvm/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x32) [0x7f44dbc26ff2]
  File "/home/k00375917/tvm/src/codegen/llvm/codegen_llvm.cc", line 1006
TVMError: Unknown call type name= placeholder call_type= 3

I know that runtime shapes via tvm.var are supported, so I think this is a bug rather than an unsupported feature in the TVM DSL. Your comments are greatly appreciated.
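For comparison, here is a minimal sketch of the tvm.var case that does build for me (the names n, a and b are just illustrative, not from roi_align):

import tvm

n = tvm.var("n")  # symbolic, runtime-sized extent
a = tvm.placeholder((n,), name="a", dtype="float32")
b = tvm.compute((n,), lambda i: a[i] * 2.0, name="b")
s = tvm.create_schedule(b.op)
f = tvm.build(s, [a, b], "llvm")  # builds fine: the extent is a Var

The difference in my snippet above is that the extent is an expression that reads the rois placeholder (tvm.ceil over the ROI values), which appears to be what codegen rejects with "Unknown call type name= placeholder".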