[SOLVED] Is there a way to run the TVM C++ inference API on Android?

I was trying to use the C++ inference API on Android so that I can use my model in an Android app.
I compiled my model into deploy.so using an arm-linux GCC cross-compiler.

But I ran into many troubles; the two main ones are described below.
deploy.so depends on several standard C++ libraries, such as libstdc++.so.6, libc.so.6, and so on.
I copied these libraries into jniLibs, and there I hit the first main trouble: only libraries whose names end with a ".so" extension can be copied into the Android application's native library directory; otherwise they are not linked. So I renamed the library and all of its dependencies so that every needed library ends with ".so".
After the extension problem was solved, I got a reloc error: "dlopen failed: unknown reloc type 17 @0xc98cc6dc(2424)". This is the second main trouble.

I use gcc-linaro-4.8-2015.06-x86_64_arm-linux-gnueabi to build deploy.so. The code is lib.export_library(libpath, cc="/data/proj/FaceLandmark/tvm/gcc-linaro-4.8-2015.06-x86_64_arm-linux-gnueabi/bin/arm-linux-gnueabi-g++").
I use Android Studio 3 to build the Android application; its toolchain is arm-linux-androideabi-4.9.

I noticed the two toolchains' versions differ: I built TVM with GCC 4.8, while Android Studio's default toolchain is arm-linux-androideabi-4.9. I don't know whether the version difference causes the reloc error. I also think the extension-renaming workaround above is ugly, and I am afraid it will lead to another series of errors; I have already spent too much time on these troubles. So I gave up the experiment and came here to ask for help.

Is there any practical way I can use the TVM C++ inference API on Android?

Thanks in advance for your reply.

target = tvm.target.create("llvm -device=arm_cpu -target=arm-linux-androideabi -mattr=+neon -mfloat-abi=soft")

You can try to use the NDK to export the lib:

lib.export_library(path_lib, ndk.create_shared, options=["-shared", "-fPIC", "-mfloat-abi=softfp", "-mfpu=neon"])
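For context on why this helps: the arm-linux-gnueabi toolchain targets glibc-based Linux, while Android uses Bionic as its libc, so a deploy.so built with that toolchain generally cannot be dlopen'd by an Android app no matter how you rename its dependencies; compiling the shared library with the NDK toolchain avoids the problem entirely. Below is a minimal end-to-end sketch of the suggested flow, assuming the older TVM API (tvm.target.create, tvm.build) from this era and that TVM_NDK_CC points at your NDK standalone-toolchain compiler; the trivial add-one module and the compiler path are placeholders for your own model and setup:

import os
import tvm
from tvm.contrib import ndk

# ndk.create_shared reads the cross-compiler path from the TVM_NDK_CC
# environment variable (placeholder path below; point it at your NDK).
os.environ["TVM_NDK_CC"] = "/path/to/android-toolchain/bin/arm-linux-androideabi-g++"

target = tvm.target.create("llvm -device=arm_cpu -target=arm-linux-androideabi -mattr=+neon -mfloat-abi=soft")

# A trivial compute standing in for your real model.
n = tvm.var("n")
A = tvm.placeholder((n,), name="A")
B = tvm.compute(A.shape, lambda i: A[i] + 1.0, name="B")
s = tvm.create_schedule(B.op)
lib = tvm.build(s, [A, B], target, name="add_one")

# Export with the NDK compiler so the .so links against Bionic, not glibc.
path_lib = "deploy.so"
lib.export_library(path_lib, ndk.create_shared, options=["-shared", "-fPIC", "-mfloat-abi=softfp", "-mfpu=neon"])

The resulting deploy.so can then be packaged with the app and loaded on the device through the TVM C++ runtime via tvm::runtime::Module::LoadFromFile, with no renaming tricks needed.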


Thank you very much!
This solved my problem.