Why doesn't nn.layer_norm have TOpPattern?

I want to disable the LayerNormToInferUnpack pass for the layer_norm operator. But when I disable LayerNormToInferUnpack, a TVMError occurs:

File "/home/zhongzheng.he/Project/stc-tvm/include/tvm/relay/op.h", line 534
TVMError: Check failed: idx < data_.size() && data_[idx].second != 0: Attribute TOpPattern has not been registered for Operator nn.layer_norm

Once I register a TOpPattern for the nn.layer_norm operator, the error no longer occurs.

This is my patch:

--- a/src/relay/op/nn/nn.cc
+++ b/src/relay/op/nn/nn.cc
@@ -870,6 +870,7 @@ RELAY_REGISTER_OP("nn.layer_norm")
 .add_argument("gamma", "Tensor", "The gamma scale factor.")
 .add_argument("beta", "Tensor", "The beta offset factor.")
 .set_support_level(1)
+.set_attr<TOpPattern>("TOpPattern", kOpaque)
 .add_type_rel("LayerNorm", LayerNormRel);

So, why doesn’t nn.layer_norm have the TOpPattern?

I think the reason is that you typically want to split the op into the statistics-gathering and elementwise operations so that those parts can be fused with the surrounding ops, and keeping it as a single op prevents that. That said, I don't think anything keeps you from changing that; it's just that the other case (splitting the op) is so much more common that no one thought of it.
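To make the split concrete, here is a minimal sketch in pure Python (no TVM, hypothetical helper names) of what unpacking layer_norm into simpler ops amounts to: a statistics-gathering phase (mean/variance reductions) followed by a purely elementwise tail. After unpacking, a fusion pass can merge that elementwise tail with neighboring elementwise ops, which a single opaque nn.layer_norm op would prevent.

```python
import math

def layer_norm(x, gamma, beta, eps=1e-5):
    """Illustrative decomposition of layer_norm over a 1-D list."""
    # Phase 1 -- statistics gathering: reductions over the normalized axis.
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    # Phase 2 -- elementwise: normalize, scale, shift. This is the part
    # that can fuse with surrounding elementwise ops once the op is unpacked.
    inv_std = 1.0 / math.sqrt(var + eps)
    return [(v - mean) * inv_std * g + b for v, g, b in zip(x, gamma, beta)]

out = layer_norm([1.0, 2.0, 3.0, 4.0], [1.0] * 4, [0.0] * 4)
```

With gamma = 1 and beta = 0, the output is zero-mean and roughly unit-variance, as expected from the normalization step.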

I just do not want to unpack the layer_norm operator.