I’ve been writing some relay graphs and encountered a confusing issue. I’m not sure if I’m doing something wrong or if this behavior is a bug.
When using a relay constant initialized with an iterable like a numpy or nd array, I get outputs that look like the constant wasn't properly initialized. Here's a snippet of code that demonstrates the issue.
First, here’s an example that works fine and produces reasonable outputs.
import numpy as np
import tvm
from tvm import relay
from tvm.contrib import graph_runtime

A = relay.var('A', shape=[1, 4], dtype='float32')
B = relay.sum(relay.multiply(A, relay.const(1.)), axis=-1)
B_func = relay.Function([A], B)
with relay.build_config(opt_level=3):
    graph, lib, params = relay.build(B_func, 'llvm', params={})
module = graph_runtime.create(graph, lib, tvm.cpu())
module.set_input('A', np.ones(shape=[1, 4]))
module.run()
print(module.get_output(0))
This produces an output of 4 as expected. However, changing the constant to wrap [1.] instead of 1. gives nonsense output.
A = relay.var('A', shape=[1, 4], dtype='float32')
B = relay.sum(relay.multiply(A, relay.const(np.asarray([1.]))), axis=-1)
B_func = relay.Function([A], B)
with relay.build_config(opt_level=3):
    graph, lib, params = relay.build(B_func, 'llvm', params={})
module = graph_runtime.create(graph, lib, tvm.cpu())
module.set_input('A', np.ones(shape=[1, 4]))
module.run()
print(module.get_output(0))
The above produces an inconsistent output that appears to be the result of multiplying by uninitialized memory.
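One thing I noticed while poking at this, though I'm not sure whether it's related: np.asarray([1.]) defaults to float64, while A is declared float32, so the constant's dtype doesn't match the variable it's multiplied with. A quick check in plain numpy shows the mismatch:

```python
import numpy as np

# A Python float wrapped in np.asarray defaults to float64,
# whereas the Relay variable above is declared float32.
const_arr = np.asarray([1.])
print(const_arr.dtype)  # float64 -- does not match A's float32

# An explicit dtype produces a matching float32 array.
print(np.asarray([1.], dtype='float32').dtype)  # float32
```

Whether the dtype mismatch is what triggers the garbage output, I can't say for certain, but it's the only difference I can find between the scalar and array cases.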
I took a quick look through the tests and noticed that there are no cases that use a constant array, so it's plausible this bug could slip by. Any thoughts on why this is happening?