Compute_at not working for reduction

Hello, I’m new to TVM and I’m trying to write a schedule for the computation below, which performs two reductions over the last dimension. It works, but it produces two separate loops. If I try to fuse them with `compute_at`, I get `TVMError: Check failed: stage.GetAttachSpec()->attach_type == kGroupRoot (4 vs. 1) : Output must be attached at root`. What am I missing here? Thanks!

```
import tvm

S = tvm.var("S")
B = tvm.var("B")

res = tvm.placeholder((S,B), name="residual", dtype="float16")
sln_in = tvm.placeholder((S,B), name="sln_in", dtype="float16")
res_add = res #+ sln_in

hidden_size = B  # the reduced (last) dimension has size B
red_axis1 = tvm.reduce_axis((0, hidden_size), "reduce1")
red_axis2 = tvm.reduce_axis((0, hidden_size), "reduce2")

x = tvm.compute((S,), lambda i: tvm.sum(res_add[i, red_axis1] / hidden_size, axis=red_axis1), name='mom1')
xx = tvm.compute((S,), lambda i: tvm.sum(res_add[i, red_axis2] * res_add[i, red_axis2] / hidden_size, axis=red_axis2), name='mom2')

s_skipln = tvm.create_schedule([x.op, xx.op])

s_skipln[xx].compute_at(s_skipln[x], s_skipln[x].op.axis[0])
print(tvm.lower(s_skipln, [x, xx, res], simple_mode=True))
```
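
In case it helps to show what I’m ultimately after: would a tuple reduction via `tvm.comm_reducer` be the recommended way to get both sums into a single loop instead? Below is an untested sketch of that idea; the reducer name `fsum2` and the op name `moments` are just placeholders I made up.

```
# Untested sketch: compute both moments with one multi-output reduction,
# so there is only a single op (and a single reduction loop) to schedule.
fsum2 = tvm.comm_reducer(
    lambda x, y: (x[0] + y[0], x[1] + y[1]),              # combine partial sums pairwise
    lambda t0, t1: (tvm.const(0, t0), tvm.const(0, t1)),  # identity elements for each output
    name="fsum2")

k = tvm.reduce_axis((0, hidden_size), "k")
mom1, mom2 = tvm.compute(
    (S,),
    lambda i: fsum2((res_add[i, k] / hidden_size,
                     res_add[i, k] * res_add[i, k] / hidden_size),
                    axis=k),
    name="moments")

s = tvm.create_schedule(mom1.op)  # mom1 and mom2 share the same op
print(tvm.lower(s, [mom1, mom2, res], simple_mode=True))
```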