Hi all,
I have a question. I found that my model's inference results are wrong after TVM auto-tuning.
My model uses transpose_a in _mx_batch_dot, which is not supported in tvm/relay/frontend/mxnet.py, so I changed the operator in mxnet.py.
Here is the change I made:
```python
def _mx_batch_dot(inputs, attrs):
    assert len(inputs) == 2
    a, b = inputs
    transpose_a = attrs.get_bool("transpose_a", False)
    transpose_b = attrs.get_bool("transpose_b", False)
    if transpose_a is True:
        # msg = 'Value {} in attribute "transpose_a" of operator batch_dot ' \
        #       'is not valid.'
        # raise tvm.error.OpAttributeInvalid(msg.format(transpose_a))
        a = _op.transpose(a, axes=[0, 2, 1])  # Lucien: added
    if transpose_b is False:
        b = _op.transpose(b, axes=[0, 2, 1])
    return _op.nn.batch_matmul(a, b)
```
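As a sanity check on the math (independent of auto-tuning), the lowering above can be emulated with NumPy: TVM's nn.batch_matmul computes A @ B^T per batch, while MXNet's batch_dot with transpose_a=True computes A^T @ B per batch. This is a minimal sketch, assuming those semantics; the shapes and data are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Shapes chosen for the transpose_a=True case: a is (batch, K, M), b is (batch, K, N)
a = rng.standard_normal((2, 3, 4))
b = rng.standard_normal((2, 3, 5))

# Reference: MXNet batch_dot(a, b, transpose_a=True) -> a^T @ b, shape (2, 4, 5)
ref = np.matmul(a.transpose(0, 2, 1), b)

# Emulate the frontend change: transpose a (transpose_a is True),
# transpose b (transpose_b is False), then batch_matmul(x, y) = x @ y^T.
a_t = a.transpose(0, 2, 1)
b_t = b.transpose(0, 2, 1)
out = np.matmul(a_t, b_t.transpose(0, 2, 1))

print(np.allclose(ref, out))
```

If this prints True, the transpose handling itself is mathematically consistent, which would point the investigation toward the tuned schedules rather than the frontend change.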
Will this change affect auto-tuning in a way that makes the inference result wrong?
Regards
Lucien