How to build runnable wasm for the browser?

I would like to deploy my network model in the browser and perform inference with WebAssembly on the client side. There are several examples related to this, but none of them has helped me successfully build the app.

First, I noticed that there is a web app example related to WebGPU. Unfortunately, it throws an error when the app runs on a PC without a GPU, so I tried to modify it into a CPU version. Below is a comparison between the original version and mine. However, the JS code still does not work and keeps reporting a function signature mismatch during the inst.systemLib() call.

  // Original JS code in index.html
  const inst = new tvmjs.Instance(new WebAssembly.Module(wasmSource), new EmccWASI());
  const gpuDevice = await tvmjs.detectGPUDevice();
  if (gpuDevice === undefined) {
    logger(
      "Cannot find WebGPU device, make sure you use the browser that suports webGPU"
    );
    return;
  }
  inst.initWebGPU(gpuDevice);
  inst.systemLib();
  const graphJson = await (await fetch("./" + network + ".json")).text();
  const synset = await (await fetch("./imagenet1k_synset.json")).json();
  const paramsBinary = new Uint8Array(
    await (await fetch("./" + network + ".params")).arrayBuffer()
  );
  logger("Start to intialize the classifier with WebGPU...");

  // Modified CPU version
  const inst = new tvmjs.Instance(new WebAssembly.Module(wasmSource), new EmccWASI());
  const graphJson = await (await fetch("./" + network + ".json")).text();
  const synset = await (await fetch("./imagenet1k_synset.json")).json();
  const paramsBinary = new Uint8Array(
    await (await fetch("./" + network + ".params")).arrayBuffer()
  );
  const ctx = inst.cpu(0);
  const syslib = inst.systemLib(); // something goes wrong here

Given the strong connection between Rust and wasm, I also tried to generate wasm from Rust. Referring to the recent PR about the test code, I successfully compiled my model with the llvm -target=wasm32-unknown-unknown --system-lib target. Yet the generated wasm executable doesn’t work; even test_wasm32 doesn’t work in the wasmtime WebAssembly runtime.
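
For reference, my test program has essentially the same shape as the test_wasm32 main.rs. A stripped-down sketch follows (the entry name default_function and the toy add workload come from the test, not from my real model, so treat this as illustrative):

#[macro_use]
extern crate tvm_runtime;

use ndarray::Array;
use tvm_runtime::{DLTensor, Module as _, SystemLibModule};

fn main() {
    // toy element-wise add, as in test_wasm32
    let mut a = Array::from_vec(vec![1f32, 2., 3., 4.]);
    let mut b = Array::from_vec(vec![1f32, 0., 1., 0.]);
    let mut c = Array::from_vec(vec![0f32; 4]);
    let mut a_dl: DLTensor = (&mut a).into();
    let mut b_dl: DLTensor = (&mut b).into();
    let mut c_dl: DLTensor = (&mut c).into();

    // --system-lib registers the compiled model as a system library,
    // so it is looked up by name rather than loaded from a file
    let syslib = SystemLibModule::default();
    let add = syslib
        .lookup("default_function")
        .expect("main function not found");
    call_packed!(add, &mut a_dl, &mut b_dl, &mut c_dl);
}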

I wonder whether I am heading in the right direction, or whether there is any documentation about deploying a model in the browser.

Based on the info shown, could you elaborate on these questions?

  • Have you added the wasm32-wasi target for rustc (e.g. via rustup target add wasm32-wasi)?
  • Have you moved the TVM runtime static library into LD_LIBRARY_PATH?

Please note that there are currently three kinds of targets for WebAssembly: wasm32-unknown-emscripten, wasm32-unknown-unknown and wasm32-wasi, and they are designed for different scenarios.

Thank you for the reply. Yes, I have double-checked the target in .cargo/config and it is wasm32-wasi. I suspected a linking path issue at first as well, but the error still appears after I add the library to the path.

To be more specific, when running cargo build on tvm/rust/runtime/tests/test_wasm32, the error is command export ‘__tvm_module_ctx’ is not a function. I have encountered the target mismatch issue before; in that case rust-lld displays a message similar to libmy_model.a: archive has no index; run ranlib to add one. Should __tvm_module_ctx be defined in main.rs? I have no idea why the symbol is missing.
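
If it should be, my best guess from reading the wasm examples is an extern declaration along the following lines, just to keep a live reference to the symbol (only a guess on my side, not verified):

extern "C" {
    static __tvm_module_ctx: i32;
}

#[no_mangle]
unsafe fn __get_tvm_module_ctx() -> i32 {
    // referring to a symbol from the model's static library should stop
    // the linker from stripping that library out
    __tvm_module_ctx
}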

@posutsai From what I know, __tvm_module_ctx is a global variable compiled into the libtest_wasm32.a static library. Can you find that static library under the rust/runtime/tests/test_wasm32/lib folder? If not, maybe you didn’t configure the LLVM_AR environment variable?

After checking __tvm_module_ctx in libtest_wasm32.a, it occurred to me that there is a target mismatch between lib.save in build_test_lib.py line 35 and .cargo/config in the official repo. When I switch the save target to wasm32-wasi, everything works fine when executing in the wasmtime runtime.

Regarding my goal, I guess building the test case is only the first step. To successfully deploy a model in the browser, one option is to use a runtime like @wasmer-wasi to bridge between JS and wasm. On the other hand, building a web app with a pipeline similar to the WebGPU app seems promising as well. @leonwanghui do you have any comment or preference?

The mismatch errors are really odd, because wasmtime should be 100% compatible with wasm32-unknown-unknown. But anyway, it doesn’t matter if you change the target to wasm32-wasi.

If you only need to deploy a model in the browser, I believe the tvm-web module would be the best choice for you. Check this blog for the design details.

I already checked that repo and tried it before posting the question, and it does not work if the PC has no GPU available. Even after I manually deactivate the GPU part and switch to the CPU version, as in the code snippet in my question, it raises an error when I call the inst.systemLib() function. It looks like the error comes from the interaction between JS and wasm.

Another reason I consider building the model to WebAssembly from Rust the better way is that WebGPU is a relatively unstable browser feature; even Google Chrome requires the user to activate it manually. At least in my case, the JS runtime is not compatible when the model runs on a PC without a GPU card.

Anyway, I will try to mimic the resnet example under the frontend directory and follow the same pipeline as the test_wasm32 example; my rough plan is sketched below. As for the three target types, is there any detailed spec about the differences among them, or could you give me some hints about their corresponding use cases?
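
Roughly, the loading side I plan to write looks like the following. The API names come from my reading of the rust runtime sources, so the exact signatures may be off, and web_resnet18 plus the input name data are my own naming:

use std::{convert::TryFrom, fs};

use tvm_runtime::{Graph, GraphExecutor, SystemLibModule};

fn main() {
    // graph JSON and params exported alongside the model's static library
    let graph_json = fs::read_to_string("web_resnet18.json").unwrap();
    let params_bytes = fs::read("web_resnet18.params").unwrap();

    let graph = Graph::try_from(graph_json.as_str()).unwrap();
    let syslib = SystemLibModule::default();
    let mut exec = GraphExecutor::new(graph, &syslib).unwrap();
    exec.load_params(tvm_runtime::load_param_dict(&params_bytes).unwrap());

    // dummy NCHW input; a real app would fill this with image data
    let input = ndarray::Array4::<f32>::zeros((1, 3, 224, 224));
    exec.set_input("data", input.into());
    exec.run();
    let _output = exec.get_output(0).unwrap();
}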

The biggest difference I know of is that wasm32-wasi can provide features like malloc, fread and other such function calls that previously were not supported in the WebAssembly sandbox. Thank you for replying to my question; these discussions really help me a lot.

I have tried the pipeline on the resnet model, and when running in the wasmtime runtime it seems there are still some undefined symbols, for example env::TVMArrayCopyFromBytes.

Here’s what I’ve done. First, I specified the tvm-runtime path in Cargo.toml just like test_wasm32 does:

[dependencies]
ndarray = "0.12"
image = "0.20"
tvm-runtime = { path = "/path/to/tvm/rust/runtime" }
tvm-frontend = { path = "/path/to/tvm/rust/frontend" }

With that alone, cargo couldn’t find the tvm_runtime library, so I manually added these two lines to tvm/rust/runtime/Cargo.toml and tvm/rust/frontend/Cargo.toml:

[lib]
crate-type = ["rlib", "staticlib"]

Then I ran cargo build --target=wasm32-wasi --release in both the runtime and frontend directories. After doing this, libtvm_runtime.a and libtvm_frontend.a are successfully generated under tvm/rust/target/wasm32-wasi/release/, and cargo is able to find them after I specify the path in build.rs:

fn main() {
    println!(
        "cargo:rustc-link-search=native={}/rust/target/wasm32-wasi/release/",
        std::env::var("TVM_HOME").unwrap()
    );
}

In my case, I also export the model in a Python script and archive it with llvm-ar-10 rcs libweb_resnet18.a web_resnet18.o. Since cargo doesn’t show any error message about the web_resnet18 library being missing, I assume the archived static file is successfully bundled into the output wasm file; the linking part of my build.rs is sketched below. Unfortunately, when I execute the generated wasm file with the wasmtime runtime, the undefined symbol error occurs. Based on my previous experience, I have already checked the symbols in the generated wasm model file.
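
For completeness, the part of my build.rs that pulls in the model archive looks roughly like this (the search path is wherever I put libweb_resnet18.a; the one shown is illustrative):

fn main() {
    // directory containing libweb_resnet18.a produced by llvm-ar
    println!("cargo:rustc-link-search=native=/path/to/model/lib");
    // link the archived model object as a static library
    println!("cargo:rustc-link-lib=static=web_resnet18");
}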

The most similar symbol across the available static libs (libtvm_runtime.a, libtvm_frontend.a and libweb_resnet18.a) is TVMArrayCopyFromTo in libtvm_frontend.a. I suspect I still lack some library, so I searched the TVM code base for the TVMArrayCopyFromBytes keyword; it turns out the symbol is defined in src/runtime/ndarray.cc. Does that mean I have to manually build another runtime static lib from the C++ files under /src/runtime? Shouldn’t the symbol be included when I build libtvm_runtime.a?

Here is some guidance from the wasmtime website: https://bytecodealliance.github.io/wasmtime/wasm-rust.html.

I’m curious why you need to compile the libtvm_runtime.a and libtvm_frontend.a static libraries at all. From what I know, wasmtime can’t load several .a static libraries at the same time. I suggest calling tvm_runtime::SystemLibModule to load libweb_resnet18.a, and then compiling the code into a test_wasm32.wasm binary, roughly as sketched below.
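
A minimal sketch of what I mean: depend on the tvm-runtime crate directly in Cargo.toml so that cargo compiles it for the wasm target itself, link only libweb_resnet18.a in build.rs, and load everything through the system library. The entry name default_function here is just a placeholder:

use tvm_runtime::{Module as _, SystemLibModule};

fn main() {
    // libweb_resnet18.a was built with --system-lib, so its functions are
    // registered with the system library at startup
    let syslib = SystemLibModule::default();
    let func = syslib
        .lookup("default_function")
        .expect("entry function not found");
    // call `func` with DLTensor arguments, as in the test_wasm32 example
    let _ = func;
}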

Please correct me if I missed something.

Well, I admit that compiling libtvm_runtime.a seems a little odd. But without the static runtime library, rustc says compilation fails due to the missing tvm_runtime. In addition, I found that the build procedure in tvm-wasm asks the user to run cargo build --target wasm32-unknown-unknown --release under the runtime directory, and the Cargo.toml in tvm-wasm's /tvm/rust/runtime submodule is indeed different from the current version in the official repo.

I am not sure if I am on the correct path, but it seems plausible since there is no undefined symbol during compilation. I do believe the TVM maintainers have their own reasons not to generate a static lib, and it shouldn’t be this complicated to simply deploy a web app. From this perspective, the solution with the JS runtime looks more promising. Yet neither of the two has let me deploy successfully.

By the way, if the description here seems too vague, please let me know and I will post more details about my code.

I’m not sure whether that is correct; maybe you could take a look at this PR (https://github.com/apache/incubator-tvm/pull/5892), which compiles the TVM runtime into a wasm library?

I’ve checked the PR you mentioned above. As I understand it, it focuses on giving an example of how to save a customized op and load it at runtime. Although the use case is slightly different from mine, I believe the build pipeline still applies to my case. In conclusion, manually modifying Cargo.toml and building libtvm_runtime.a outside the official pipeline doesn’t make sense.

Anyway, based on my recent work, I believe the TVM documentation still lacks a simple example, like from_torch, illustrating the steps to deploy a model in the browser. I’ll keep trying to achieve my goal; please let me know if you know of any example code or solution. Thank you very much.

@posutsai I guess this project (https://github.com/kazum/tvm-wasm) from @kazum would offer some help.

The steps I mentioned above follow exactly the repo you refer to, and unfortunately it didn’t work. Thank you for sharing the information.

I think this is related to an issue with the wasi runtime: [Bug][Web] tvmjs runtime error when loading, missing “runtime.SystemLib” · Issue #7734 · apache/tvm.