Support for Embedding, Bidirectional and others

I see that as of now, some layers useful for NLP models such as embedding, bidirectional, and others are unsupported by TVM’s frontend. I notice that some of them are commented out in _convert_map in frontend files (like in https://github.com/dmlc/tvm/blob/master/python/tvm/relay/frontend/keras.py).

  1. Is there a reason why these layers are currently unsupported?
  2. Are there plans or development efforts already underway?
  3. I was planning to develop these layers anyway. Could anyone briefly explain how those layers could be incorporated into TVM?
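For reference, here is roughly what I imagine an Embedding converter in keras.py could look like, following the pattern of the existing _convert_* functions there. This is only a sketch: the `(inexpr, keras_layer, etab)` signature and the `etab.new_const` helper are my reading of the current converters, and the `take`-based gather is just one possible lowering.

```python
def _convert_embedding(inexpr, keras_layer, etab):
    # Embedding has a single weight: the (input_dim, output_dim) lookup table.
    weight = keras_layer.get_weights()[0]
    weight_expr = etab.new_const(weight)
    # Inputs are integer indices; gather the corresponding rows of the table.
    indices = _op.cast(inexpr, dtype='int32')
    return _op.take(weight_expr, indices, axis=0)
```

The function would then need an 'Embedding': _convert_embedding entry in _convert_map (where it is currently commented out).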

Thanks!

Hi,

I can't really help out with 1. or 2. The best advice I can give you is to look at the roadmaps and see whether there are plans to expand support to those layers.

As for 3., I would also like to know what steps are required to introduce new layers.
I guess a high-level answer would be (a rough sketch follows the list):

  1. Create a compute rule for the operator

  2. Create a scheduling rule for the operator based on some backend (in the simplest case the schedule could be the vanilla one given by the compute rule)

  3. Register the compute rule with NNVM (or Relay)

  4. Register the schedule with TVM for the given target backend
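Concretely, for steps 1 and 2 I picture something along these lines. This is a toy elementwise op, not a real NLP layer, just to show the compute/schedule pattern; it uses the tvm.compute / tvm.create_schedule API from the dmlc/tvm version linked above (newer releases move these under tvm.te):

```python
import numpy as np
import tvm

# Step 1: compute rule -- a declarative description of the operator's math.
n = tvm.var("n")
A = tvm.placeholder((n,), name="A")
B = tvm.placeholder((n,), name="B")
C = tvm.compute((n,), lambda i: A[i] + B[i], name="C")

# Step 2: schedule rule -- here simply the vanilla schedule from the compute.
s = tvm.create_schedule(C.op)

# Sanity-check the pair by building for the CPU and running it.
f = tvm.build(s, [A, B, C], target="llvm")
ctx = tvm.cpu(0)
a = tvm.nd.array(np.random.rand(8).astype("float32"), ctx)
b = tvm.nd.array(np.random.rand(8).astype("float32"), ctx)
c = tvm.nd.array(np.zeros(8, dtype="float32"), ctx)
f(a, b, c)
np.testing.assert_allclose(c.asnumpy(), a.asnumpy() + b.asnumpy())
```

For steps 3 and 4 I believe the register_compute / register_schedule helpers are what gets used (under nnvm/top for the NNVM path and tvm.relay.op for Relay), but I'm not sure of the exact signatures, hence the question below.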

My question is: where and how exactly in the code base is this done?
(In the old NNVM-based frontend it seems that most operators are here, while in the Relay variant it seems to be here.)

I hope a more experienced user can help us out :slight_smile:
