I have a graph that returns the output of a `prim::ListConstruct` node. I get the following error when trying to convert it to Relay:
```
Traceback (most recent call last):
  ....
  File "pytorch_to_relay.py", line 125, in compile_
    mod, params = relay.frontend.from_pytorch(trace_model, inp_shape)
  File "/tvm/python/tvm/relay/frontend/pytorch.py", line 2190, in from_pytorch
    outputs, ret_name, convert_map, prelude)
  File "/tvm/python/tvm/relay/frontend/pytorch.py", line 2078, in convert_operators
    elif operator == "prim::ListConstruct" and _should_construct_dynamic_list(op_node):
  File "/tvm/python/tvm/relay/frontend/pytorch.py", line 112, in _should_construct_dynamic_list
    if is_used_by_list_add(filter(lambda use: use.user.kind() != "prim::Loop", uses)):
  File "/tvm/python/tvm/relay/frontend/pytorch.py", line 85, in is_used_by_list_add
    output_type = _get_node_type(use.user)
  File "/tvm/python/tvm/relay/frontend/pytorch.py", line 1652, in _get_node_type
    assert node.outputsSize() == 1
AssertionError
```
The node that trips the assertion is a `prim::Return` node, whose `outputsSize()` is always 0. Is this expected behavior when the graph returns a `List[Tensor]`?
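For context, here is a minimal sketch of the kind of graph I mean: a scripted function whose return value comes straight out of a `prim::ListConstruct` (the function name and shapes are my own, not from my actual model):

```python
import torch
from typing import List

@torch.jit.script
def returns_list(x: torch.Tensor) -> List[torch.Tensor]:
    # The TorchScript graph for this function ends in a
    # prim::ListConstruct whose output feeds the return.
    return [x, x + 1.0]

# The graph IR shows the ListConstruct node feeding the return:
print(returns_list.graph)
```

Passing a graph shaped like this to `relay.frontend.from_pytorch` is what reaches the assertion above, since the uses of the `ListConstruct` output include the return node.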