[TVM][GraphRuntime] How to share parameters between multiple tvm::runtime::Module instances?

Because TVM GraphRuntime does not support control flow, we have to split our model into two parts, and we need to share parameters between them to save memory. However, there are two issues we need to address first:

  1. GraphRuntime automatically allocates memory for every data entry when the module is created (GraphRuntime::SetupStorage). How can we specify that a given entry should be allocated later instead?

  2. “set_input” always copies the parameter (a DLTensor) into the corresponding data entry. We need a setter that accepts an NDArray so the underlying storage can be shared; this should be relatively easy to add as a new PackedFunc (see the sketch after this list).
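For issue 2, here is a minimal sketch of how such a setter could look from the Python side. It assumes a packed function named `set_input_zero_copy` has been added to GraphRuntime; both the name and the function are hypothetical, and `graph_json`/`lib` stand for the artifacts produced by `relay.build`:

```python
import numpy as np
import tvm
from tvm.contrib import graph_runtime

# Create the graph runtime module as usual.
mod = graph_runtime.create(graph_json, lib, tvm.cpu(0))

weight = tvm.nd.array(np.random.rand(1024, 1024).astype("float32"))

# mod.set_input("dense_weight", weight) would copy the DLTensor into the
# module's own data entry. A zero-copy setter would instead keep a
# reference to the NDArray, so several modules can share one buffer.
set_input_zero_copy = mod.module["set_input_zero_copy"]  # hypothetical
set_input_zero_copy("dense_weight", weight)
```

With such a setter in place, issue 1 reduces to teaching GraphRuntime::SetupStorage to skip (or defer) allocation for entries that will be supplied this way.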

Could anyone give us some advice? Thanks!

https://github.com/dmlc/tvm/pull/3384 thanks to @ajtulloch

@tqchen thanks for replying, but that pull request shares parameters among multiple instances of the same model. Our requirement is to share parameters between different models. Consider a seq2seq model that we split into two modules, an encoder and a decoder; the two may share parameters in some dense layers.

Likely you can use the same mechanism to share the parameters across different models, as long as you know their correspondence.
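If the `share_params` mechanism from that pull request can indeed be pointed at a different module, sharing between an encoder and a decoder might look roughly like the sketch below. This assumes the shared parameter names match in both graphs, and `encoder_mod`/`decoder_mod` are illustrative Relay modules:

```python
import tvm
from tvm import relay
from tvm.contrib import graph_runtime

# Build the encoder and decoder into separate graph runtime modules.
enc_graph, enc_lib, enc_params = relay.build(encoder_mod, target="llvm")
dec_graph, dec_lib, dec_params = relay.build(decoder_mod, target="llvm")

encoder = graph_runtime.create(enc_graph, enc_lib, tvm.cpu(0))
decoder = graph_runtime.create(dec_graph, dec_lib, tvm.cpu(0))

# Load the shared parameters into the encoder once.
params_bytes = relay.save_param_dict(enc_params)
encoder.load_params(params_bytes)

# Re-point the decoder's matching entries at the encoder's storage,
# assuming share_params accepts a module built from a different graph
# whose parameter names correspond.
decoder.module["share_params"](encoder.module, bytearray(params_bytes))
```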

@tqchen I created a pull request; please help review it, thanks!