Introducing TVM.NET: bringing TVM features into the .NET world

Proposal:

I would like to propose TVM.NET, which will enable TVM features in the .NET world. TVM.NET provides a .NET Standard binding for TVM. It aims to implement the complete TVM API in C#, which allows .NET developers to deploy TVM-compiled models with the cross-platform .NET Standard framework.


Motivation:

This introduces the TVM stack into the .NET world, which has a rich community where users can build numerous apps taking advantage of TVM-compiled models.

Design:

To start, I have given a draft hierarchical design below.

(image: draft hierarchical design diagram)

I will publish the internals of each component later on.

Please refer to the WIP PR for development progress.

Timeline:

I believe the skeleton might be ready in 3 to 4 weeks.


Thanks for the proposal. Please add more discussion as the project takes shape, mainly with respect to the API choices, the project layout, and ways to link the native libtvm (e.g. via the C API).


Phase 0:

This development phase is limited to the TVMCSharp project/module. TVMCSharp will be built as a C# class library.

TVMCSharp Internals:

Linking or Binding to Native TVM runtime library:

The .NET framework provides a flexible way to import a native library (the TVM runtime) into a C# workspace, as shown below.

(image: native library import example)
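As a stand-in for the image, here is a minimal sketch of what such an import could look like. TVMFuncListGlobalNames and TVMGetLastError are real entry points in TVM's C runtime API, but the library name and marshaling shown here are assumptions, not the final binding:

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch only: two DllImport bindings into the native TVM runtime.
// The library name and marshaling details are assumptions.
internal static class NativeApi
{
    private const string LibName = "tvm_runtime";

    // C signature: int TVMFuncListGlobalNames(int* out_size, const char*** out_array);
    [DllImport(LibName, CallingConvention = CallingConvention.Cdecl)]
    internal static extern int TVMFuncListGlobalNames(out int outSize, out IntPtr outArray);

    // C signature: const char* TVMGetLastError(void);
    [DllImport(LibName, CallingConvention = CallingConvention.Cdecl)]
    internal static extern IntPtr TVMGetLastError();
}
```

Note that a DllImport declaration does not load the library until the first call, so declarations like these compile and load fine even when the native lib is absent.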

There are multiple challenges with the above approach in terms of data forwarding/marshaling between C# code and native code; these will be solved category by category in an explicit manner.

API Choices:

I have given this some thought. I think it would be best, in the interest of app developers, to keep the .NET API equivalent to the Python APIs. This will make it easy for any new user to grasp the usage quickly, as TVM has many examples illustrated in Python. If anyone has another opinion, I will be happy to take it up. Below I am providing pseudo code which shows how a typical TVM runtime would be used in a .NET environment.
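The pseudo code itself did not survive in this copy of the thread, so the sketch below is a hedged reconstruction of what Python-style usage might look like in C#; every name in it (Module, Runtime, set_input, load_params, run, get_output) is illustrative, mirroring TVM's Python graph runtime API rather than any committed .NET API:

```csharp
// Pseudo code only: names mirror TVM's Python API, not a final .NET design.
Module mod = Module.load("resnet18.so");
Runtime runtime = Runtime.create(graphJson, mod, context);

runtime.set_input("data", inputArray);   // NDArray holding the input tensor
runtime.load_params(paramsBytes);
runtime.run();

NDArray output = runtime.get_output(0);
```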

@tqchen, @(whomsoever interested): please provide your expert opinions.

Just to add a note: below are some open points I am still brainstorming to find the best approach.

  1. The native library needs to be managed either by code or by project config. Currently it is static, fixed to an Ubuntu-compiled lib.

  2. Whether to upload a precompiled (TVM native) lib for a particular target environment to the cloud and download it during build, or to compile it along with the dotnet build.
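For point 1, one possible direction (a sketch, not a committed design) is to resolve the native library name at runtime instead of hard-coding the Ubuntu path; the file names below follow the usual per-platform conventions:

```csharp
using System.Runtime.InteropServices;

// Sketch only: pick the native TVM runtime file name per platform
// instead of a fixed Ubuntu-compiled .so path.
internal static class NativeLibraryName
{
    internal static string Resolve()
    {
        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
            return "tvm_runtime.dll";
        if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX))
            return "libtvm_runtime.dylib";
        return "libtvm_runtime.so";   // Linux and other Unix-like hosts
    }
}
```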

Any suggestion is welcome!

I haven't looked into your implementation, but from the TVMCSharp internals diagram it seems there are some things that are not needed, in particular with respect to the PackedFuncManager (and how much data marshaling you are proposing).

I am not a C# expert, but from my POV it seems likely that we can just call into the external C API in the same shared memory space, as in the case of Java, Go, and Rust.

As an example for @ANSHUMAN.TRIPATHY of just using the C API, I generated the C# P/Invoke (using this lib). Since it's autogenerated, don't rely on it 100%; some of it will be wrong!

P/Invoke code like the above would then be wrapped with C# classes to hide away all the P/Invoke smell.
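As a small illustration of that wrapping (TVMGetLastError is a real entry point in TVM's C runtime API; the wrapper class and library name here are my own sketch):

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch only: wrap a raw P/Invoke behind an idiomatic C# member
// so callers never see IntPtr or marshaling details.
public static class TVMRuntimeInfo
{
    // C signature: const char* TVMGetLastError(void);
    [DllImport("tvm_runtime", CallingConvention = CallingConvention.Cdecl)]
    private static extern IntPtr TVMGetLastError();

    // Friendly wrapper: converts the native C string to a managed string.
    public static string LastError =>
        Marshal.PtrToStringAnsi(TVMGetLastError()) ?? string.Empty;
}
```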

API Choices: I understand the reasoning behind keeping similarity with the Python APIs, but I think you might want to balance that with .NET API conventions.

Consider:

IReadOnlyList<NDArray> outputs = runtime.Outputs;

or

var output = runtime.Outputs[0];

or

var output = runtime.GetOutput(0);

vs

var output = runtime.get_output(0);

The Java implementation has a nice balance of Java conventions and a discoverable TVM API.

In any case, these are just my opinions :)


@jmorrill, @tqchen: Thanks for your valuable feedback!

The PR is updated now.

Reviews are welcome! @jmorrill, @tqchen, @jonso, @yzhliu, @(whomsoever interested).

Thank you!

I can probably give a more in-depth review, but here are some suggestions from a quick look:

IDisposable - On classes such as Module or NDArray, I would implement the IDisposable interface to be consistent with dotnet, allowing usage with the using statement.

    ~Module()
    {
        /* If Dispose was not called before, clean up in the finalizer */
        Dispose();
    }

    public void Dispose()
    {
        /* Tell the GC not to run the finalizer; saves this object from
           potentially being promoted to higher GC generations */
        GC.SuppressFinalize(this);
        if (!UIntPtr.Zero.Equals(moduleHandle))
        {
            /* ... */
        }
    }

I would even suggest going a step further and implementing the disposable pattern, possibly as a base class (e.g. TVMDisposable), overriding the "protected virtual void Dispose(bool disposing)" shown in the previous link.
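A sketch of what that base class could look like ("TVMDisposable" is only the name proposed above, not an existing type; where the native handle is actually released is left as a comment):

```csharp
using System;

// Sketch of the standard dispose pattern in a shared base class.
public abstract class TVMDisposable : IDisposable
{
    private bool _disposed;

    public void Dispose()
    {
        Dispose(disposing: true);
        // Tell the GC not to run the finalizer; Dispose already cleaned up.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;
        if (disposing)
        {
            // release managed resources here
        }
        // release the native TVM handle here (safe in both paths)
        _disposed = true;
    }

    ~TVMDisposable()
    {
        // Finalizer runs only if Dispose was never called.
        Dispose(disposing: false);
    }
}
```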

Exceptions - Don't be shy about throwing exceptions. They provide valuable information and are hard for developers to ignore :). For example, in Runtime.cs the line graphJsonString = Utils.ReadStringFromFile(runtimeParam.graphJsonPath); (where Utils.ReadStringFromFile returns null) should throw an exception to the caller, as not being able to read that file is indeed something "exceptional". I would also throw in cases where you have things like if (!isInstantiated) { Console.WriteLine("Not instantiated yet!"); return; }

For the C functions that return error codes, I would suggest throwing on those, possibly with an Exception subclass. Maybe public class TVMException : Exception, with subclasses for individual exceptions? Then add a helper method like:

internal static void ThrowIfError(int tvmErrorCode, string message)
{
   switch (tvmErrorCode)
   {
      case TVMErrorCode.SomethingBadHappened:
         throw new TvmSomethingBadException(message);
      case ...: /* all other error codes */
   }
}

Then usage like this where you do a P/Invoke: ThrowIfError(TVMFuncFree(ptr), "TVMFuncFree failed");
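The exception types referred to above might be sketched like this (TVMException and TvmSomethingBadException are the names proposed in this thread, not existing TVM types):

```csharp
using System;

// Sketch of the proposed exception hierarchy for native error codes.
public class TVMException : Exception
{
    public TVMException(string message) : base(message) { }
}

// One hypothetical subclass per error category.
public class TvmSomethingBadException : TVMException
{
    public TvmSomethingBadException(string message) : base(message) { }
}
```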

Native Library - There is a public const string libName = "tvmlibs/libtvm_runtime.so"; On Windows this DLL would be called tvm_runtime.dll. With dotnet core, if you just specify "tvm_runtime", then on Linux dotnet will automatically search for "libtvm_runtime.so" in the host's library search paths (and LD_LIBRARY_PATH), and on Windows dotnet will automatically search for "tvm_runtime.dll" according to Windows' library path resolution (current directory, then everything in PATH). I'm not sure if this is the behavior you were looking for or not, but food for thought.

Conventions - Probably my most trivial suggestion :). I would have some convention for private class-level variables to easily distinguish them from local function variables; it makes parsing the code with your eyes a bit easier. In the project's C++ code they use an underscore postfix, "myVariable_". In dotnet, it's common for class-level variables to have an underscore prefix, like "_myVariable". There are other conventions, but my suggestion is to pick one.

I may have more suggestions, but I'm hungry :). I know your code is early in its implementation and some of these suggestions might be things you already planned, but I've taken an interest, as we do a lot of dotnet at work and this will be useful.


@jmorrill: Thank you very much for your insightful comments.

It was a miss on my end not to update the To-Do list in the PR. I have done it now; please check.

I have planned for the points you raised except one: the native library search config. I do have a plan for a Windows target as well; I added this point in my previous note in this RFC.

I think it will be better if I take this up after this PR is merged. Please let me know if you think otherwise.

I have taken note of your suggestions (to avoid future misses :)) as below:

1. IDisposable support

2. Exceptions (already on my to-do list)

3. Conventions (I missed this during rework, thanks for pointing it out; will definitely handle it)

Thanks again! 👍

@tqchen: I seek your expert advice on the point below. Please help!

This is regarding NDArray Design.

In this feature I am trying to bridge between the TVM.NET runtime (managed memory) and the TVM C runtime (unmanaged memory).

Below is a snapshot of the NDArray class design which I currently have.

In the case of runtime inference input it is okay, as its lifetime is short, so it will be destroyed automatically when no longer in use, along with the unmanaged memory it encompasses.

My concern is with the case where .NET wants to retain the NDArray for a longer time for future use, for example when fetching the inference output and storing it for further analysis or tasks.

So I want to have an API to detach the managed NDArray from the unmanaged NDArray, as the unmanaged memory is then no longer required.

I have a proposal for two APIs as below:

  1. NDArray().Detach(): once the output is fetched from the TVM runtime, keep it for read-only purposes.
  2. NDArray().Attach(): when the user wants to feed the same NDArray back into the runtime.
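To make the Detach idea concrete, here is a rough self-contained sketch; the real binding would read from the native handle via the C API, but here the contents are passed in directly, and all field and method names are hypothetical:

```csharp
using System;

// Sketch only: Detach copies the tensor contents into managed memory
// so the unmanaged TVM buffer can be released early.
public class NDArraySketch
{
    private float[] _managedData = Array.Empty<float>();
    private IntPtr _nativeHandle;  // unmanaged TVM array handle

    public void Detach(float[] nativeContents)
    {
        // Real code would marshal from _nativeHandle via the C API.
        _managedData = (float[])nativeContents.Clone();
        _nativeHandle = IntPtr.Zero;  // native memory no longer needed
    }

    public float[] Data => _managedData;
}
```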

Please share your thoughts on this.

Thank you!

Others may confirm if this is correct, but I believe the Java version implements this by keeping track of whether the managed NDArray is the owner of the native handle, and using that to determine if it can release it.

Here, I believe it’s the “isView” bool.

Maybe something similar could be set automatically, depending on whether the managed NDArray is runtime-created vs user-created?

Sorry, actually I may have been mistaken about the use of isView after a closer look at the Java impl and the test code here: https://github.com/apache/incubator-tvm/blob/1dcf8a16ee3a93dff5ffc1ad1a66892eda03ef13/jvm/core/src/test/java/org/apache/tvm/contrib/GraphRuntimeTest.java

Gentle ping @tqchen !

Let us keep the API consistent with the rest of the APIs (Java, JS, Python).

When an NDArray is returned, we just keep it as a strong reference (no explicit attachment). isView is only used for very limited cases (e.g. in a callback where we only want a weak reference).
