I recently ran into some problems converting a newly trained FP16 MXNet model into a TVM Relay function.
First, the basic description:
- I trained the model following the tutorials provided by MXNet.
- The test code is the official code from the TVM tutorial.
The problems are as follows:
- When testing on the server, I found I could not use the GPU for testing.
- The parameter dtypes of my converted model are a mix of FP16 and FP32, with most of them being FP32.
The error message:

```
File "/root/workspace/tvm/python/tvm/relay/frontend/mxnet.py", line 1156, in _update_shape_dtype
    "%s: dtype not expected %s vs %s" % (k, dtype, v.dtype))
ValueError: bn_data_gamma: dtype not expected float16 vs float32
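The error suggests that a single dtype string (e.g. `"float16"`) was passed to `relay.frontend.from_mxnet`, while some checkpoint parameters (typically BatchNorm gamma/beta and moving statistics, which mixed-precision training usually keeps in FP32) are actually `float32`. One workaround, if my reading of `_update_shape_dtype` is right, is to pass a per-name dtype dict that matches each parameter's real dtype. Below is a minimal sketch of building such a dict; the `FakeParam` class and the parameter names are stand-ins for the real `arg_params`/`aux_params` loaded from an MXNet checkpoint, and `build_dtype_dict` is a hypothetical helper, not a TVM API:

```python
# Sketch: build a per-parameter dtype map for relay.frontend.from_mxnet.
# In practice arg_params/aux_params come from mx.model.load_checkpoint;
# here they are mocked with plain objects so the logic is self-contained.

class FakeParam:
    """Stand-in for an MXNet NDArray; only the .dtype attribute matters here."""
    def __init__(self, dtype):
        self.dtype = dtype

# Mixed-precision checkpoint: conv weights in fp16, BatchNorm params kept in fp32.
arg_params = {"conv0_weight": FakeParam("float16"),
              "bn_data_gamma": FakeParam("float32")}
aux_params = {"bn_data_moving_mean": FakeParam("float32")}

def build_dtype_dict(arg_params, aux_params, data_dtype="float16"):
    """Map the input and every parameter name to its actual dtype, so that
    the frontend's dtype check sees no mismatch."""
    dtype_dict = {"data": data_dtype}
    for name, arr in {**arg_params, **aux_params}.items():
        dtype_dict[name] = str(arr.dtype)
    return dtype_dict

dtype_dict = build_dtype_dict(arg_params, aux_params)
print(dtype_dict["bn_data_gamma"])  # float32, matching the checkpoint
```

The resulting dict would then be passed as the `dtype` argument, e.g. `relay.frontend.from_mxnet(sym, shape_dict, dtype=dtype_dict, arg_params=arg_params, aux_params=aux_params)`, instead of a single `"float16"` string.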