@jvin1011 Yes, there are different runtimes with different platform support and also different format support. Unfortunately, we do not yet have any runtime that supports the tflite file format. However, there are a number of model converters out there that can convert tflite to e.g. onnx (see here for an example), and then you can use the existing runtimes.
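As an illustration, here is a minimal sketch using tf2onnx, one commonly used converter (not necessarily the one linked above); the file paths and opset below are placeholders you would adjust for your model:

```python
# Convert a TensorFlow Lite model to ONNX with tf2onnx.
# Equivalent CLI: python -m tf2onnx.convert --tflite model.tflite --output model.onnx --opset 13
import tf2onnx

model_proto, _ = tf2onnx.convert.from_tflite(
    "model.tflite",           # path to the source tflite model (placeholder)
    opset=13,                 # ONNX opset to target
    output_path="model.onnx", # where the converted model is written (placeholder)
)
print(f"Converted model with {len(model_proto.graph.node)} nodes")
```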
Of course, it is always possible to use any other inference engine (including tflite) directly, without going through NNE. The downside is that you have to integrate the libraries yourself, and if one does not run on all the target platforms you are interested in, you need to do the same for other runtimes, creating huge fragmentation in your code. Or, e.g., if you want to use NPUs, you will need to add a runtime for each hardware vendor you plan to run on as well.
That is why we started NNE: to provide you with a single API that lets you access all platforms the same way, and that is extensible to include future runtimes as well.
So, long story short: I would try to export or convert your model to onnx and save yourself the pain of adding your own runtime.
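If you want to sanity-check the converted model before importing it into the engine, a quick onnxruntime session works. A sketch, assuming "model.onnx" from above and a float32 input; the dummy batch size of 1 is an arbitrary choice:

```python
# Run the converted ONNX model once with dummy input to verify it loads and executes.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx")
inp = sess.get_inputs()[0]
# Replace any dynamic dimensions (reported as None or a string) with 1.
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.zeros(shape, dtype=np.float32)
outputs = sess.run(None, {inp.name: dummy})
print([o.shape for o in outputs])
```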