On an ARM Android device (CPU: 660), TVM uses about 400% CPU while doing inference. Does that mean TVM set the thread number to 4?
How can I set the thread number to 1 to reduce CPU usage during inference on ARM Android devices?
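For reference, one approach I have seen suggested is setting the `TVM_NUM_THREADS` environment variable before launching the process that runs inference. This is a hedged sketch under the assumption that the TVM runtime on the device reads this variable to cap its thread pool size; I have not confirmed it on this specific device:

```shell
# Assumption: TVM's runtime thread pool reads TVM_NUM_THREADS
# to limit the number of worker threads it spawns.
export TVM_NUM_THREADS=1

# Verify the variable is visible to child processes
# (the inference process must inherit this environment):
sh -c 'echo "TVM_NUM_THREADS=$TVM_NUM_THREADS"'
```

On Android, the variable has to be set in the environment of the app or RPC server process itself, not just in an adb shell session, for the runtime to pick it up.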