Hi Nico, I have exported my ONNX model in FP16 format. When I attempt inference using the RDGHlsl backend, I encounter the following error:
[2025.05.20-06.53.23:601][ 0]LogNNERuntimeRDGHlsl:
Warning: Input at index '0' (from template T0) is of type
"Half" which is not supported for that input.
[2025.05.20-06.53.23:601][ 0]LogNNERuntimeRDGHlsl:
Warning: OperatorRegistry failed to validate
operator:Slice
[2025.05.20-06.53.23:602][ 0]LogNNERuntimeRDGHlsl:
Warning: Model validator 'RDGModel validator'
detected an error.
[2025.05.20-06.53.23:602][ 0]LogNNERuntimeRDGHlsl:
Warning: Model is not valid.
[2025.05.20-06.53.23:602][ 0]LogNNERuntimeRDGHlsl:
Warning: Cannot create a model from the model data
with id B67298DC456900C2B7797DA66DF4EA2F
[2025.05.20-06.53.24:736][0]LogTemp: Error: Could not
create the RDG model
Is FP16 inference unsupported in NNE in general, or only with the RDGHlsl backend? The same model exported in FP32 works fine with the RDGHlsl backend, but performance is a little slow, so I'm exploring ways to optimize and improve it.
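Not part of the question itself, but for anyone debugging a similar error: a quick way to confirm offline which precision the exported model actually declares is to inspect the graph inputs with the `onnx` Python package. This is a minimal sketch, assuming `onnx` is installed; the codes 1 and 10 are the standard `TensorProto` elem_type values for FLOAT and FLOAT16.

```python
import sys

# Standard ONNX TensorProto elem_type codes: 1 = FLOAT (FP32), 10 = FLOAT16
ELEM_TYPE_NAMES = {1: "FLOAT", 10: "FLOAT16"}

def input_dtypes(model):
    """Return {input_name: dtype_name} for a loaded onnx ModelProto."""
    return {
        i.name: ELEM_TYPE_NAMES.get(i.type.tensor_type.elem_type, "OTHER")
        for i in model.graph.input
    }

if __name__ == "__main__":
    import onnx  # pip install onnx; only needed when run as a script
    model = onnx.load(sys.argv[1])  # path to the exported .onnx file
    for name, dtype in input_dtypes(model).items():
        print(f"{name}: {dtype}")
```

If any input reports `FLOAT16`, that matches the "Half ... not supported" warning above; re-exporting (or casting the graph) to FP32 would be the workaround until the backend accepts half-precision inputs.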