[Relay] Defer knowledge of constants until compile time


Question: Can I set a placeholder in Relay for constants (much the way I can leave variables untyped until compilation time)? I am particularly curious about convolution strides and padding.

I’m looking to port PyTorch models to run on Relay. The idea is to translate a subset of Python (realistically TorchScript) into Relay’s IR. On function invocation I’d like to determine the shapes and constants, then either compile the function or fetch it from a cache. Right now I have shape specialization figured out (using `expr = relay.Call(expr, [var_with_known_shape])` followed by `relay.ir_pass.infer_type(expr)`). However, when generating ops like conv, the Relay IR expects parameters like strides and padding to be known already, which isn’t possible at graph-construction time.
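For context, the per-call specialization I have in mind looks roughly like this. It's a minimal sketch in plain Python, not Relay API (`compile_specialized`, `conv2d_dispatch`, and the cache-key scheme are all hypothetical names): the fallback I see, absent constant placeholders, is to treat strides and padding as part of the specialization key alongside shapes.

```python
# Hypothetical sketch: specialize and cache per (shape, constant) key.
# None of this is Relay API; it only illustrates the caching scheme.

_cache = {}
compile_count = 0

def compile_specialized(shape, stride, pad):
    # Stand-in for building and compiling a Relay function whose
    # conv2d attributes (stride/pad) are baked in as constants.
    global compile_count
    compile_count += 1
    def compiled(x):
        return ("conv", shape, stride, pad)
    return compiled

def conv2d_dispatch(x_shape, stride, pad):
    # Constants become part of the cache key, just like shapes.
    key = (x_shape, stride, pad)
    if key not in _cache:
        _cache[key] = compile_specialized(x_shape, stride, pad)
    return _cache[key]

f1 = conv2d_dispatch((1, 3, 224, 224), (2, 2), (3, 3))
f2 = conv2d_dispatch((1, 3, 224, 224), (2, 2), (3, 3))  # cache hit
f3 = conv2d_dispatch((1, 3, 224, 224), (1, 1), (3, 3))  # new stride, recompile
```

The downside is that every distinct stride/padding combination triggers a fresh compile, which is exactly the cost a constant placeholder in Relay would avoid.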

Here’s a PyTorch repro revealing the issue (the conv2d call is the problem):

import torch.nn.functional as F

def conv2d_impl(input, conv_w, conv_stride, conv_pad, bn_mean, bn_var, bn_w, bn_b):
  # stride and padding arrive here as runtime values, but Relay's
  # conv2d expects them as compile-time attributes
  c = F.conv2d(input, conv_w, stride=conv_stride, padding=conv_pad)
  b = F.batch_norm(c, bn_mean, bn_var, bn_w, bn_b)
  return b