Relay Keras ReLU generation is wrong

When I dump the Relay IR, the generated AST is incorrect for the case `keras.layers.ReLU(max_value=6., threshold=1., negative_slope=0.)`.

It generates this:

```
def @main(%input_1: Tensor[(1, 8, 3, 3), float32]) -> Tensor[(1, 8, 3, 3), float32] {
  %0 = subtract(%input_1, 1f /* ty=float32 */) /* ty=Tensor[(1, 8, 3, 3), float32] */;
  multiply(0f /* ty=float32 */, %0) /* ty=Tensor[(1, 8, 3, 3), float32] */
}
```

The `multiply` by `negative_slope` should only be applied to values below the threshold, but it is applied to the whole tensor.
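For reference, the expected behavior documented for `keras.layers.ReLU` can be sketched in NumPy (the helper name here is hypothetical, not TVM or Keras code):

```python
import numpy as np

def keras_relu_reference(x, max_value=6.0, threshold=1.0, negative_slope=0.0):
    # Documented keras.layers.ReLU semantics:
    #   x >= max_value             -> max_value
    #   threshold <= x < max_value -> x
    #   x < threshold              -> negative_slope * (x - threshold)
    out = np.where(x < threshold, negative_slope * (x - threshold), x)
    return np.minimum(out, max_value)

x = np.array([-2.0, 0.5, 1.0, 3.0, 10.0], dtype=np.float32)
print(keras_relu_reference(x))  # with negative_slope=0: [0. 0. 1. 3. 6.]
```

With these parameters the slope term only affects inputs below the threshold; the IR above instead subtracts and multiplies every element, so the entire output collapses to zero.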