Is there a setitem operations for tensors in relay?

I want an operation analogous to strided_slice, but one that creates a new array with the “sliced” content replaced. Something like:

a = [1, 2, 3]
new_a = strided_setitem(a, 1:2, [4])
# new_a = [1, 4, 3]

I don’t mind composing it from multiple existing relay operators, but the best thing I can come up with now would be very complex at best, not to mention extremely inefficient.
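For reference, the composition I have in mind is roughly slice-and-concatenate. Here is a hedged numpy sketch of that shape (the name `strided_setitem` is hypothetical, not an existing relay op; a relay version would chain strided_slice and concatenate the same way):

```python
import numpy as np

def strided_setitem(a, start, stop, values):
    """Hedged sketch: emulate an immutable setitem for a 1-D tensor by
    slicing off the prefix and suffix and concatenating the replacement
    in between, mirroring a relay strided_slice + concatenate chain."""
    a = np.asarray(a)
    return np.concatenate([a[:start], np.asarray(values), a[stop:]])

print(strided_setitem([1, 2, 3], 1, 2, [4]))  # -> [1 4 3]
```

This gets unwieldy fast in higher dimensions (one slice-and-concat per axis), which is why it feels complex and inefficient.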

Is there already such an operation that I missed? If not, does anyone have pointers on how to implement it?

That is not possible directly. Relay tensors are immutable by design. The closest thing you can do is either create a new tensor or wrap it in a Reference.

I get that tensors are immutable; what I want is to return a new tensor with the modified content.

There is just no simple way to do so right now.

Looks like a primitive op we could add as a relay op. Could you do that with tvm.if_then_else, selecting the replacement value when the index is in bounds?
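The if_then_else idea amounts to an elementwise select over output indices. As a hedged illustration in numpy terms (not actual TVM code; inside a topi compute this would be tvm.if_then_else per output index):

```python
import numpy as np

def setitem_via_select(a, start, stop, values):
    """Hedged sketch of the if_then_else formulation for 1-D input:
    for each output index i, pick values[i - start] when
    start <= i < stop, and a[i] otherwise."""
    a = np.asarray(a)
    values = np.asarray(values)
    idx = np.arange(a.shape[0])
    in_bounds = (idx >= start) & (idx < stop)
    # clip keeps the gather index legal even where the mask is False
    gathered = values[np.clip(idx - start, 0, values.shape[0] - 1)]
    return np.where(in_bounds, gathered, a)

print(setitem_via_select([1, 2, 3], 1, 2, [4]))  # -> [1 4 3]
```

A per-element select like this is purely injective, which also suggests an injective schedule is a reasonable fit.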

OK, so it’s not currently possible. I don’t have the time to do it right away, but I’ll get to it.

I have to dig down and figure out all the details of adding a relay op anyway, for other things.


As an update, I have a first shot that is only lightly tested here:


  • reasonably confident that the compute part in topi works correctly,
  • slightly confident that I registered all the right things for topi (schedule, …),
  • not very confident that the relay part is ok and has all the right things registered.

If any of you have feedback on what I’m doing wrong or right, that would be great, especially on the relay part.

I now have tests for the TOPI part and it works. The schedule might still be suboptimal, but I figured injective is a good fit for what I do.

The relay part is not working, because it appears I missed registering some elements (TOpPattern). I’ll try to figure that out later. Comments on what’s there are still welcome.

Now I need some help.

I fixed a couple of issues with the Relay part but I’m stumped with this error:

E           tvm._ffi.base.TVMError: Traceback (most recent call last):
E             [bt] (8) 9   libtvm.dylib                        0x000000011c044d15 tvm::relay::ScheduleGetter::Create(tvm::relay::Function const&) + 1717
E             [bt] (7) 8   libtvm.dylib                        0x000000011c0461cc tvm::relay::ScheduleGetter::VisitExpr(tvm::relay::Expr const&) + 252
E             [bt] (6) 7   libtvm.dylib                        0x000000011c04932d tvm::relay::ExprFunctor<tvm::Array<tvm::Tensor, void> (tvm::relay::Expr const&)>::VisitExpr(tvm::relay::Expr const&) + 157
E             [bt] (5) 6   libtvm.dylib                        0x000000011c049691 tvm::NodeFunctor<tvm::Array<tvm::Tensor, void> (tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::Array<tvm::Tensor, void> (tvm::relay::Expr const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::Array<tvm::Tensor, void> (tvm::relay::Expr const&)>*) const + 305
E             [bt] (4) 5   libtvm.dylib                        0x000000011c04b0b8 tvm::relay::ExprFunctor<tvm::Array<tvm::Tensor, void> (tvm::relay::Expr const&)>::InitVTable()::'lambda4'(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::Array<tvm::Tensor, void> (tvm::relay::Expr const&)>*)::__invoke(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::Array<tvm::Tensor, void> (tvm::relay::Expr const&)>*) + 24
E             [bt] (3) 4   libtvm.dylib                        0x000000011c0478b6 tvm::relay::ScheduleGetter::VisitExpr_(tvm::relay::CallNode const*) + 2566
E             [bt] (2) 3   libtvm.dylib                        0x000000011c04bdf7 tvm::runtime::TypedPackedFunc<tvm::Array<tvm::Tensor, void> (tvm::Attrs const&, tvm::Array<tvm::Tensor, void> const&, tvm::relay::Type const&, tvm::Target const&)>::operator()(tvm::Attrs const&, tvm::Array<tvm::Tensor, void> const&, tvm::relay::Type const&, tvm::Target const&) const + 199
E             [bt] (1) 2   libtvm.dylib                        0x000000011c04c1fe tvm::Array<tvm::Tensor, void> tvm::runtime::TVMRetValue::AsObjectRef<tvm::Array<tvm::Tensor, void> >() const + 846
E             [bt] (0) 1   libtvm.dylib                        0x000000011b938649 dmlc::LogMessageFatal::~LogMessageFatal() + 57
E             File "/Users/anakha/ext/tvm/include/tvm/packed_func_ext.h", line 205
E           TVMError: Check failed: ObjectTypeChecker<TObjectRef>::Check(ptr): Expected type List[Tensor] but get Tensor

I have no idea why that happens or what I did wrong.
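For anyone hitting the same check: this error typically means the registered compute callback returned a bare tensor where the FFI layer expects a list of tensors, so returning `[out]` instead of `out` is the usual fix. A plain-Python stand-in for the check (a hedged mock, not TVM's actual implementation; the real check lives on the C++ side in packed_func_ext.h):

```python
def check_compute_return(ret):
    """Mimic the List[Tensor] check that the TVM FFI applies to the
    value returned from a compute callback (hedged mock)."""
    if not isinstance(ret, (list, tuple)):
        raise TypeError("Expected type List[Tensor] but get Tensor")
    return list(ret)

out = object()                 # stand-in for a single topi tensor
check_compute_return([out])    # a one-element list passes
try:
    check_compute_return(out)  # a bare tensor trips the check
except TypeError as e:
    print(e)                   # -> Expected type List[Tensor] but get Tensor
```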

If you want to reproduce it, download the branch above and run the tests in tvm/tests/python/relay/. This happens for all the configurations.

So I fixed my last problems and now I have a PR for this: