[ONNX] Error when importing ONNX model exported by PyTorch

Hi,

I’m trying to use PyTorch to export a model in ONNX format and then import that ONNX file into TVM. An error occurs at nnvm.frontend.from_onnx(onnx_model):

```
Traceback (most recent call last):
  File "test.py", line 29, in <module>
    sym, params = nnvm.frontend.from_onnx(onnx_model)
  File "/home/ubuntu/.local/lib/python3.5/site-packages/nnvm-0.8.0-py3.5.egg/nnvm/frontend/onnx.py", line 974, in from_onnx
    sym, params = g.from_onnx(graph, opset)
  File "/home/ubuntu/.local/lib/python3.5/site-packages/nnvm-0.8.0-py3.5.egg/nnvm/frontend/onnx.py", line 829, in from_onnx
    op = self._convert_operator(op_name, inputs, attr, opset)
  File "/home/ubuntu/.local/lib/python3.5/site-packages/nnvm-0.8.0-py3.5.egg/nnvm/frontend/onnx.py", line 930, in _convert_operator
    sym = convert_map[op_name](inputs, attrs, self._params)
  File "/home/ubuntu/.local/lib/python3.5/site-packages/nnvm-0.8.0-py3.5.egg/nnvm/frontend/onnx.py", line 207, in _impl_v1
    channels = _infer_channels(inputs[1], params, True)
IndexError: list index out of range
```

I’m using PyTorch 0.4.1, ONNX 1.3.0 (installed via pip), and the master branch of dmlc/tvm from GitHub. The model I’m testing with is ResNet50.
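
For reference, the export-and-import flow is essentially the following (a minimal sketch; the file name, the dummy 1x3x224x224 input, and the pretrained weights are just illustrative, not my exact script):

```python
import torch
import torchvision
import onnx
import nnvm

# Export a ResNet50 to ONNX with PyTorch.
model = torchvision.models.resnet50(pretrained=True)
model.eval()
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, "resnet50.onnx")

# Load the exported file and hand it to the NNVM ONNX frontend.
onnx_model = onnx.load("resnet50.onnx")
sym, params = nnvm.frontend.from_onnx(onnx_model)  # <- fails here with IndexError
```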
I tried using caffe2.python.onnx.backend to load the ONNX file generated by PyTorch, and it worked fine.
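
That Caffe2 sanity check was along these lines (again a rough sketch; the input shape is just an example):

```python
import numpy as np
import onnx
import caffe2.python.onnx.backend as backend

# Load the same ONNX file and run one dummy batch through the Caffe2 backend.
onnx_model = onnx.load("resnet50.onnx")
rep = backend.prepare(onnx_model, device="CPU")
outputs = rep.run(np.random.randn(1, 3, 224, 224).astype(np.float32))
print(outputs[0].shape)  # expected to be (1, 1000) for an ImageNet ResNet50
```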

Thanks!

Hi Eliza~

If it is convenient, could you share your script and ONNX model? I can take a look~

Hi Hao @hao_lin,
I get exactly the same error message when importing an ONNX model. How can I fix this problem? Thank you~