[Relay][Frontend] Stack multiple LSTMBlockCell in Tensorflow

Hello everyone,

I’m trying to port LSTMs to TVM in order to auto-tune them. I followed this example to successfully port a single LSTM cell to TVM.

I tried to add a second LSTMBlockCell, following the first one, with:

g2, (out_m2, out_m3) = tf.contrib.rnn.LSTMBlockCell(num_hidden, forget_bias=forget_bias)(g, (out_m0, out_m1))

The output node is now ‘root/lstm_cell_1/LSTMBlockCell’, and the inputs stay the same. I changed ‘num_layers’ to 2. The model compiles successfully in TVM, but a warning arises during:

for e, i in zip(input_node, input_data):
    m.set_input(e, tvm.nd.array(i))

Warning: cannot find "root/lstm_cell/LSTMBlockCell_c" among input
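To illustrate what I think is happening (this is my guess at the frontend’s behavior, and `state_input_names` below is a hypothetical stand-in for its naming logic, not actual TVM code): the cell-state input names seem to be derived from the cell’s layer name by appending ‘_c’ / ‘_h’, so the names I pass for the first cell no longer match what the converter created for the second cell.

```python
# Hypothetical sketch of the assumed naming scheme; not actual TVM frontend code.
first_layer = "root/lstm_cell/LSTMBlockCell"     # name of the first LSTMBlockCell
second_layer = "root/lstm_cell_1/LSTMBlockCell"  # name TF gives the stacked cell

def state_input_names(layer_name):
    # Assumed: the frontend derives the c/h state placeholders from layer_name.
    return [layer_name + "_c", layer_name + "_h"]

# My input_node list uses the first cell's names, but if the converter
# derived the placeholders from the second cell's layer_name, the lookup
# in set_input fails and produces the warning above:
print(state_input_names(first_layer))
print(state_input_names(second_layer))
```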

The problem probably comes from this part of the tensorflow frontend, where the wrong layer_name is passed to _LSTMBlockCellLayer._impl (‘root/lstm_cell_1/LSTMBlockCell’ instead of ‘root/lstm_cell/LSTMBlockCell’). Also, where is the loop over ‘num_layers’?

Am I doing something wrong? The frontend code seems ready to accept stacked LSTMBlockCells, so could you tell me the correct way to do it, please? The classical TF way (MultiRNNCell) is not supported by the Relay conversion, right? There are mentions of it in the tensorflow frontend tests.

Thanks guys
@srkreddy1238
