Hello, I am interested in using the NNE API.
If I understand correctly, NNE implements model inference on top of the onnxruntime inference framework. My question: at the inference step, ModelInstance->RunSync(Inputs, Outputs), is there any way to substitute a different inference framework, such as MNN, which is more lightweight? Or, besides the ONNX model format, what other model file formats does the NNE API support?
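For reference, this is roughly how I am running inference today. It is only a sketch against the UE 5.3-style CPU runtime interface (exact signatures can differ by engine version), and the runtime name "NNERuntimeORTCpu" is the default onnxruntime plugin; an MNN-backed runtime name would be hypothetical. The point where the backend is chosen is the GetRuntime call, which is why I am asking whether another framework can plug in there:

```cpp
// Sketch of my current NNE usage (UE 5.3-style CPU interface; signatures
// may differ slightly between engine versions).
#include "NNE.h"
#include "NNERuntimeCPU.h"
#include "NNEModelData.h"
#include "NNETypes.h"

void RunInference(TObjectPtr<UNNEModelData> ModelData, TArray<float>& Input, TArray<float>& Output)
{
	// The runtime is resolved by name; "NNERuntimeORTCpu" is the onnxruntime-backed
	// plugin. This is where a different backend (e.g. an MNN-based runtime, if one
	// existed) would presumably have to be registered and selected.
	TWeakInterfacePtr<INNERuntimeCPU> Runtime = UE::NNE::GetRuntime<INNERuntimeCPU>(TEXT("NNERuntimeORTCpu"));
	if (!Runtime.IsValid())
	{
		return;
	}

	// Create a model from the imported asset data, then an instance to run.
	TSharedPtr<UE::NNE::IModelCPU> Model = Runtime->CreateModelCPU(ModelData);
	if (!Model.IsValid())
	{
		return;
	}
	TSharedPtr<UE::NNE::IModelInstanceCPU> ModelInstance = Model->CreateModelInstanceCPU();

	// Placeholder shape for my actual model's input.
	TArray<UE::NNE::FTensorShape> InputShapes = { UE::NNE::FTensorShape::Make({ 1, 3, 224, 224 }) };
	ModelInstance->SetInputTensorShapes(InputShapes);

	// Bind pre-allocated CPU buffers for input and output tensors.
	UE::NNE::FTensorBindingCPU InputBinding{ Input.GetData(), Input.Num() * sizeof(float) };
	UE::NNE::FTensorBindingCPU OutputBinding{ Output.GetData(), Output.Num() * sizeof(float) };

	// The inference step in question: can this be backed by something other than onnxruntime?
	ModelInstance->RunSync({ InputBinding }, { OutputBinding });
}
```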