[SOLVED] One_hot output mismatch

The following small example gives different results on different runs, and none of the results are correct.

Am I using it incorrectly?

@jonso @haichen

import tvm
from tvm import relay

data = relay.var("data", shape=(3,), dtype="int32")
out = relay.one_hot(data, tvm.relay.const(1.0, "float32"), tvm.relay.const(0.0, "float32"), depth=3,
        dtype="float32", axis=-1)

print(out)

func = relay.Function([data], out)
mod = tvm.IRModule.from_expr(func)

target = 'llvm'
with relay.build_config(opt_level=3):
    graph, lib, params = relay.build_module.build(mod, target=target, params=None)

from tvm.contrib import graph_runtime
rt_mod = graph_runtime.create(graph, lib, ctx=tvm.cpu(0))

import numpy as np
np_data = np.array([0, 1, 2]).astype("int32")  # must match the declared "int32" dtype of "data"
rt_mod.set_input("data", np_data)
rt_mod.run()
my_o = rt_mod.get_output(0).asnumpy()
print(my_o)

Did you forget to run?

:smiley: I actually forgot. But sadly, the error still exists

First run


[[-2.450818e+18  1.569454e-43  1.569454e-43]
 [ 1.569454e-43 -2.450818e+18  1.569454e-43]
 [ 1.569454e-43  1.569454e-43 -2.450818e+18]]

Second run


[[1.4584890e-19 3.0734807e+32 3.0734807e+32]
 [3.0734807e+32 1.4584890e-19 3.0734807e+32]
 [3.0734807e+32 3.0734807e+32 1.4584890e-19]]
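For reference, the expected result here is an identity-like one-hot matrix. A minimal plain-Python sketch of what one_hot with depth=3 and axis=-1 should produce for these inputs:

```python
def one_hot(indices, depth, on_value=1.0, off_value=0.0):
    # One row of length `depth` per index; the position equal to the
    # index gets on_value, every other position gets off_value.
    return [[on_value if j == i else off_value for j in range(depth)]
            for i in indices]

print(one_hot([0, 1, 2], depth=3))
# [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

The huge/tiny floats above are uninitialized memory, not numerical error, which hints the on/off constants were never loaded.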

Add rt_mod.set_input(**params) and try again?

Thanks :expressionless: Blunders on my side. Things are working fine now.

It is really tricky and misleading that some constants are compiled into parameters. In such cases it is necessary to set params even though the model has no weights. Always bundling params and lib together, as discussed in [DISCUSS] Module based Model Runtime Interface, could avoid this confusion.

Just saw this message - glad it got resolved :slight_smile: