Hello! Thanks for the useful tutorial and exciting package. I was able to run the example code successfully and wanted to see how far I could take this with a large language model. I tried a few ONNX variants of LLMs, all with the same result: a crash in Model->CreateModelInstance with the following error:
Assertion failed: false [ORTExceptionHandler.cpp] [Line: 16] ONNXRuntime threw an exception with code 6, e.what(): "Exception during initialization: D:\build++UE5\Sync\Engine\Plugins\Experimental\NNERuntimeORTCpu\Source\ThirdParty\onnxruntime\Onnxruntime\Private\core\optimizer\initializer.cc:31 onnxruntime::Initializer::Initializer !model_path.IsEmpty() was false. model_path must not be empty. Ensure that a path is provided when the model is created or loaded. ".
The code I have is exactly the same as what I use for the MNIST model example:
if (ManuallyLoadedModelData)
{
	TWeakInterfacePtr<INNERuntimeCPU> Runtime = UE::NNE::GetRuntime<INNERuntimeCPU>(FString("NNERuntimeORTCpu"));
	if (Runtime.IsValid())
	{
		ModelHelper = MakeShared<FMyModelHelper>();
		TUniquePtr<UE::NNE::IModelCPU> Model = Runtime->CreateModel(ManuallyLoadedModelData);
		if (Model.IsValid())
		{
			ModelHelper->ModelInstance = Model->CreateModelInstance(); // engine hard crashes here
		}
	}
}
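For completeness, this is the only guard I can think to add around the failing call (a sketch, assuming ModelInstance is a smart pointer exposing IsValid(), as in the tutorial). It does not help here, since the assertion fires inside CreateModelInstance itself rather than returning null:

	// Defensive variant (sketch only): the hard crash happens inside
	// CreateModelInstance, so this check is never reached on failure.
	ModelHelper->ModelInstance = Model->CreateModelInstance();
	if (!ModelHelper->ModelInstance.IsValid())
	{
		UE_LOG(LogTemp, Error, TEXT("CreateModelInstance failed for the loaded ONNX model"));
	}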
I am assuming the error is a side effect of something going wrong either during the import of the model or because the model itself is not supported. Is this expected to work at all, before I sink too much time into it? And does any documentation exist on the current limitations and constraints for supported ONNX models (e.g. supported operators, opset versions, model size)? Any pointers appreciated, thanks!