[Solved] [AutoTVM] XGBTuner training error

Hello,

I am using AutoTVM to search for schedules for a GEMM operator, but I encountered the following error. It happens while training the cost model, after 64 iterations. A reproducible script is at https://gist.github.com/icemelon9/f8493ff3ec1d4d82b597d2e363c041ec.

Environment:
OS: Ubuntu 18.04
Python: 3.7
XGBoost: 0.81
TVM: upstream

Error log:

Traceback (most recent call last):
  File "error.py", line 70, in <module>
    callbacks=[autotvm.callback.log_to_file(logfile)])
  File "/home/ubuntu/repo/tvm/python/tvm/autotvm/tuner/xgboost_tuner.py", line 70, in tune
    super(XGBTuner, self).tune(*args, **kwargs)
  File "/home/ubuntu/repo/tvm/python/tvm/autotvm/tuner/tuner.py", line 134, in tune
    self.update(inputs, results)
  File "/home/ubuntu/repo/tvm/python/tvm/autotvm/tuner/model_based_tuner.py", line 250, in update
    self.cost_model.fit(self.xs, self.ys, self.plan_size)
  File "/home/ubuntu/repo/tvm/python/tvm/autotvm/tuner/xgboost_cost_model.py", line 165, in fit
    x_train = self._get_feature(xs)
  File "/home/ubuntu/repo/tvm/python/tvm/autotvm/tuner/xgboost_cost_model.py", line 301, in _get_feature
    ret[i, :] = fea_cache[ii]
ValueError: could not broadcast input array from shape (312) into shape (336)
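For context, the failure is a plain NumPy shape mismatch inside `_get_feature`. A minimal sketch of the mechanism (plain NumPy, not TVM itself; the cache contents and lengths here are hypothetical, matching the shapes in the traceback):

```python
import numpy as np

# The cost model preallocates one feature matrix whose row width is
# taken from the first cached feature vector, then copies every
# config's features into it. With itervar features, a config with a
# different loop nest can produce a shorter vector, and the row
# assignment fails exactly as in the traceback above.
fea_cache = {0: np.zeros(336), 1: np.zeros(312)}  # hypothetical lengths
feature_len = len(fea_cache[0])                   # 336, from the first entry
ret = np.empty((len(fea_cache), feature_len))

caught = None
try:
    for i, fea in fea_cache.items():
        ret[i, :] = fea  # shape (312,) into a row of width 336
except ValueError as err:
    caught = err

print(caught)  # could not broadcast input array ...
```

So the error is not in XGBoost itself; it is triggered whenever configurations in the same task yield feature vectors of different lengths.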

Reply by @eddie

This is a current limitation of the XGBoost cost model: when itervar features are used and the loop nests change across template configurations, the feature vectors end up with different lengths. Try using curve features instead.
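The feature type is chosen when the tuner is constructed. A minimal sketch of the suggested workaround, assuming `task` is the `autotvm` task already created in the gist:

```python
from tvm import autotvm

# feature_type defaults to 'itervar'; 'curve' samples the loop
# structure into fixed-length feature vectors, so configs with
# different loop nests no longer break the cost-model fit.
tuner = autotvm.tuner.XGBTuner(task, feature_type='curve')
```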

Using curve features resolved the problem.

Hello! I got the same error even when using 'curve'. I worked around it by using 'knob'. The docs say 'itervar' is more accurate and 'curve' is more general, so both seem preferable to 'knob' for my task. Is a fix planned for this?