Can a customized convolution operator be fused with topi.nn.relu?

Hi all. I want to write my own convolution operator, which is not a standard convolution; the other parts of the computation graph will reuse the TOPI op inventory.

However, I don't know whether this customized conv operator can benefit from operator fusion with topi.nn.relu.

For example, I tested a snippet from the tutorial as follows:

import tvm
import topi

n = 32
Input = tvm.placeholder((n, n), name='Input')
Filter = tvm.placeholder((3, 3), name='Filter')
di = tvm.reduce_axis((0, 3), name='di')
dj = tvm.reduce_axis((0, 3), name='dj')
with tvm.target.create("cuda"):
    Output = tvm.compute(
        (n - 2, n - 2),
        lambda i, j: tvm.sum(Input[i + di, j + dj] * Filter[di, dj], axis=[di, dj]),
        name='Output')
    # Output = topi.nn.conv2d(Input, Filter, strides=1, padding=2, dilation=1)
    C = topi.nn.relu(Output)
    s = tvm.create_schedule(C.op)
    # topi.generic.schedule_conv2d_nchw(C), which performs operator fusion, cannot be used here
    print(tvm.lower(s, [Input, Filter, C], simple_mode=True))

As the comment shows, the TOPI schedule that performs operator fusion cannot be applied to this custom compute. Is there another way to achieve this goal?
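
I know the two stages can be fused by hand. Continuing from the snippet above, here is a minimal sketch that uses compute_at, the generic fusion primitive in TE scheduling:

s = tvm.create_schedule(C.op)
# Pull the convolution stage inside the relu stage's outer loop,
# so both computations run in a single fused loop nest.
s[Output].compute_at(s[C], C.op.axis[0])
print(tvm.lower(s, [Input, Filter, C], simple_mode=True))

But what I am really after is the automatic fusion that the TOPI schedules perform.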

You can try building a Relay model; operator fusion runs as a standard optimization pass there.
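
For example, here is a rough sketch (assuming a TVM build where these passes are exposed under relay.transform) showing FuseOps merging an elementwise chain into one fused function:

import tvm
from tvm import relay

x = relay.var("x", shape=(1, 2), dtype="float32")
y = relay.add(x, relay.const(2.0))
z = relay.nn.relu(y)
mod = relay.Module.from_expr(relay.Function([x], z))

# FuseOps groups the add and relu into one fused function;
# relay.build runs this same pass automatically at opt_level >= 1.
seq = relay.transform.Sequential([
    relay.transform.InferType(),
    relay.transform.FuseOps(fuse_opt_level=1),
])
with relay.build_config(opt_level=1):
    mod = seq(mod)
print(mod)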

Thanks. I tried to use Relay.

import numpy as np
import tvm
from tvm import relay

target = "cuda"        # assuming the same CUDA target as above
target_host = "llvm"

x = relay.var("x", shape=[1, 2], dtype="float32")
y = x + relay.const(2.)
C = relay.nn.relu(y)
# mod = relay.module.Module.from_expr(C)
mod = relay.Function(relay.analysis.free_vars(C), C)

# note: relay.build_config (not tvm.build_config) controls the Relay passes,
# and FuseOps only kicks in at opt_level >= 1
with relay.build_config(opt_level=0):
    graph, lib, params = relay.build(mod, target=target, target_host=target_host,
                                     params={'x': np.array([[1., 2.]], dtype='float32')})
    print(graph)
    # print(lib)
    print(params)
    lib.export_library('ops.tar')

I tried the above snippet :zipper_mouth_face: just as a test, and I got a segmentation fault…

Do you have any ideas?

You need to call traverse_inline in your schedule function, which should be similar to schedule_conv2d_nchw.
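
Roughly like this (an untested sketch; schedule_my_conv is a placeholder name, and traverse_inline lives in topi.util in the versions I have used):

import tvm
import topi
from topi.util import traverse_inline

def schedule_my_conv(outs):
    # Same structure as schedule_conv2d_nchw: create the schedule over the
    # final outputs, then let traverse_inline walk the graph.
    outs = [outs] if isinstance(outs, tvm.tensor.Tensor) else outs
    s = tvm.create_schedule([x.op for x in outs])

    def _callback(op):
        # traverse_inline inlines injective stages (such as relu when it is
        # not the final output) and invokes this callback on each op it visits.
        if op.name == 'Output':  # the custom conv stage from the snippet above
            conv = op.output(0)
            out = outs[0]
            i, j = s[out].op.axis
            s[out].bind(i, tvm.thread_axis("blockIdx.x"))
            s[out].bind(j, tvm.thread_axis("threadIdx.x"))
            if conv.op not in s.outputs:
                # compute the conv inside the relu's loop nest -> one fused kernel
                s[conv].compute_at(s[out], j)

    traverse_inline(s, outs[0].op, _callback)
    return s

Then s = schedule_my_conv(C) gives you a schedule with the relu fused into the custom conv, and tvm.build(s, [Input, Filter, C], "cuda") should produce a single kernel.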