Hi all. I want to write my own convolution operator, which is not a standard convolution; other parts of the computation graph will reuse the TOPI op inventory.
However, I don't know whether this customized conv operator can still benefit from operator fusion with topi.nn.relu.
For example, I tested a snippet from the tutorial as follows:
n = 32
Input = tvm.placeholder((n, n), name='Input')
Filter = tvm.placeholder((3, 3), name='Filter')
di = tvm.reduce_axis((0, 3), name='di')
dj = tvm.reduce_axis((0, 3), name='dj')
with tvm.target.create("cuda"):
    Output = tvm.compute(
        (n - 2, n - 2),
        lambda i, j: tvm.sum(Input[i + di, j + dj] * Filter[di, dj], axis=[di, dj]),
        name='Output')
    # Output = topi.nn.conv2d(Input, Filter, strides=1, padding=2, dilation=1)
    C = topi.nn.relu(Output)
    s = tvm.create_schedule(C.op)
    # topi.generic.schedule_conv2d_nchw(C), which performs operator fusion, cannot be used here
    print(tvm.lower(s, [Input, Filter, C], simple_mode=True))
As the comment shows, the generic TOPI schedule that performs operator fusion cannot be applied to my custom compute. Is there another way to achieve this fusion?