[Relay] relay.build_module.optimize?

Hi,

Is the build_module.optimize method useful in some way before compiling a model? I found some pieces of code that call this method, but I don’t see it used in any of the tutorials, nor can I find much documentation about it.

However, when I tried to use it I wasn’t able to compile successfully. I always got an error like:

Traceback (most recent call last):
  File "prepare_model.py", line 99, in <module>
    main(model)
  File "prepare_model.py", line 86, in main
    graph, lib, params = relay.build_module.build(mod, target, target_host=target_host, params=params)
  File "/tvm/incubator-tvm/python/tvm/relay/build_module.py", line 244, in build
    graph_json, mod, params = bld_mod.build(func, target, target_host, params)
  File "/tvm/incubator-tvm/python/tvm/relay/build_module.py", line 109, in build
    self._build(func, target, target_host)
  File "/tvm/incubator-tvm/python/tvm/_ffi/_ctypes/function.py", line 207, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /tvm/incubator-tvm/build/libtvm.so(tvm::relay::ExprMutator::VisitExpr(tvm::relay::Expr const&)+0x96) [0x7f267ad97e56]
  [bt] (7) /tvm/incubator-tvm/build/libtvm.so(tvm::relay::ExprFunctor<tvm::relay::Expr (tvm::relay::Expr const&)>::VisitExpr(tvm::relay::Expr const&)+0x450) [0x7f267ac16320]
  [bt] (6) /tvm/incubator-tvm/build/libtvm.so(tvm::relay::ExprFunctor<tvm::relay::Expr (tvm::relay::Expr const&)>::InitVTable()::{lambda(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::relay::Expr (tvm::relay::Expr const&)>*)#6}::_FUN(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::relay::Expr (tvm::relay::Expr const&)>*)+0x27) [0x7f267ac117f7]
  [bt] (5) /tvm/incubator-tvm/build/libtvm.so(tvm::relay::ForwardRewriter::VisitExpr_(tvm::relay::CallNode const*)+0x876) [0x7f267ac70536]
  [bt] (4) /tvm/incubator-tvm/build/libtvm.so(std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), void tvm::runtime::TypedPackedFunc<tvm::relay::Expr (tvm::relay::Call const&, tvm::Array<tvm::relay::Expr, void> const&, tvm::NodeRef const&)>::AssignTypedLambda<tvm::relay::Expr (*)(tvm::relay::Call const&, tvm::Array<tvm::relay::Expr, void> const&, tvm::NodeRef const&)>(tvm::relay::Expr (*)(tvm::relay::Call const&, tvm::Array<tvm::relay::Expr, void> const&, tvm::NodeRef const&))::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)+0xad) [0x7f267ac93a5d]
  [bt] (3) /tvm/incubator-tvm/build/libtvm.so(tvm::relay::alter_op_layout::AlterOpLayoutRewrite(tvm::relay::Call const&, tvm::Array<tvm::relay::Expr, void> const&, tvm::NodeRef const&)+0xae8) [0x7f267ad1a3a8]
  [bt] (2) /tvm/incubator-tvm/build/libtvm.so(tvm::relay::alter_op_layout::CallInfer(tvm::relay::Call const&, tvm::Array<tvm::Layout, void> const&, tvm::Array<tvm::Layout, void> const&, tvm::Array<tvm::Array<tvm::Expr, void>, void> const&)+0x74) [0x7f267ad18854]
  [bt] (1) /tvm/incubator-tvm/build/libtvm.so(tvm::relay::Op tvm::runtime::Downcast<tvm::relay::Op, tvm::relay::Expr>(tvm::relay::Expr)+0x198) [0x7f267ad1cfa8]
  [bt] (0) /tvm/incubator-tvm/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x32) [0x7f267a6d8a12]
  File "/tvm/incubator-tvm/include/tvm/runtime/object.h", line 786
TVMError: Check failed: ref->template IsInstance<typename SubRef::ContainerType>(): Downcast from relay.Function to relay.Op failed.

Does optimize return a Function rather than a Module?

Here’s my code:

import os

import mxnet as mx
import tvm
from tvm import relay
from tvm.contrib import ndk

target = tvm.target.arm_cpu(model='pixel2')
target_host = 'llvm -device=arm_cpu -model=pixel2 -target=arm64-linux-android -mattr=+neon'

def get_model(model_name, batch_size=1):
    prefix, epoch = model_name, 0
    img_size = 299 if model_name == 'inceptionv3' else 224
    data_shape = (batch_size, 3, img_size, img_size)
    sym, arg_params, aux_params = mx.model.load_checkpoint(prefix, epoch)
    mod, params = relay.frontend.from_mxnet(sym, {"data": data_shape}, arg_params=arg_params, aux_params=aux_params)
    with relay.build_config(opt_level=3):
        mod, params = relay.build_module.optimize(mod, target=target, params=params)
    return mod, params

def main(model_str):
    print(model_str)
    print("getting model...")
    mod, params = get_model(model_str)
    try:
        os.mkdir(model_str)
    except FileExistsError:
        pass
    print("building...")
    print("(relay)")
    with relay.build_config(opt_level=3):
        graph, lib, params = relay.build_module.build(mod, target, target_host=target_host, params=params)
    print("dumping lib...")
    lib.export_library(model_str + '/' + 'deploy_lib_cpu.so', ndk.create_shared)
    print("dumping graph...")
    with open(model_str + '/' + 'deploy_graph.json', 'w') as f:
        f.write(graph)
    print("dumping params...")
    with open(model_str + '/' + 'deploy_param.params', 'wb') as f:
        f.write(relay.save_param_dict(params))

if __name__ == '__main__':
    models = ['resnet-18']
    for model in models:
        main(model)

Any insight, guys?

relay.build_module.build calls relay.build_module.optimize internally, so you don’t need to call optimize before build.
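
For example (a minimal sketch, assuming the same mod, params, target, and target_host as in your snippet above), the flow would simply be:

# Let relay.build_module.build run the optimization passes itself,
# instead of calling relay.build_module.optimize beforehand.
with relay.build_config(opt_level=3):
    graph, lib, params = relay.build_module.build(
        mod, target, target_host=target_host, params=params)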

The optimize function may not be idempotent. If you call optimize twice, I guess it fails too. It seems that the Relay pass alter_op_layout is the root cause. You can debug the code in src/relay/pass/alter_op_layout.cc yourself.
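
A quick way to check the idempotence theory (a sketch only, reusing mod, params, and target from the code above):

# Apply optimize twice on the same module; if the second call fails with
# the same Downcast error, the optimization pipeline is not idempotent.
with relay.build_config(opt_level=3):
    mod1, params1 = relay.build_module.optimize(mod, target=target, params=params)
    mod2, params2 = relay.build_module.optimize(mod1, target=target, params=params1)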


I see, thank you. I don’t think that calling the alter_op_layout pass more than once should throw an error, though. Maybe it’s something we can improve?

As far as I know, it throws an error even if I only call the FuseOps Relay pass:

import tvm
from tvm import relay

shape = (1, 2, 3)
tp = relay.TensorType(shape, "float32")
a = relay.var("1", tp)
b = relay.var("2", tp)
c = a + b
d = a + b
e = c + d
func = relay.Function([a, b], e)

mod = relay.Module({"main": func})
print(a, b, type(c), func, type(func))
# mod = relay.transform.EliminateCommonSubexpr()(mod)
mod = relay.transform.FuseOps()(mod)
mod = relay.transform.PrintIR()(mod)

The robustness of the Relay passes needs to be improved.


Is this error due to some pass dependency? I checked that if we call the ‘SimplifyInference’ pass (a level 0 pass) before ‘FuseOps’, the error goes away.
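
For reference, a minimal sketch of that workaround on the toy module above (assuming only the pass ordering matters):

# Same toy module as above; running SimplifyInference first
# makes the subsequent FuseOps call succeed.
mod = relay.transform.SimplifyInference()(mod)
mod = relay.transform.FuseOps()(mod)
mod = relay.transform.PrintIR()(mod)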