Thanks for the Barracuda GPU support. I've successfully run inference with a float32 ONNX model on HoloLens 2 using Barracuda without any hassle, but the inference time was too high due to the complex architecture, so I thought of using int8 models to reduce it. I found that Barracuda added int8 model support in the 0.7.0 release, and I am trying to run my int8 ONNX model with the latest Barracuda 3.0 release. Unfortunately, Unity was not able to recognize the model, and the following error appeared when I copied the model into the project folder. Any insights would be great!
OnnxImportException: Unknown type QuantizeLinear encountered while parsing layer data_quantized.
Unity.Barracuda.ONNX.ONNXModelConverter.Err (Unity.Barracuda.Model model, System.String layerName, System.String message, System.String extendedMessage, System.String debugMessage) (at Library/PackageCache/com.unity.barracuda@3.0.0/Barracuda/Runtime/ONNX/ONNXModelConverter.cs:3434)
Unity.Barracuda.ONNX.ONNXModelConverter.ConvertOnnxModel (Onnx.ModelProto onnxModel) (at Library/PackageCache/com.unity.barracuda@3.0.0/Barracuda/Runtime/ONNX/ONNXModelConverter.cs:2954)
Unity.Barracuda.ONNX.ONNXModelConverter.Convert (Google.Protobuf.CodedInputStream inputStream) (at Library/PackageCache/com.unity.barracuda@3.0.0/Barracuda/Runtime/ONNX/ONNXModelConverter.cs:170)
Unity.Barracuda.ONNX.ONNXModelConverter.Convert (System.String filePath) (at Library/PackageCache/com.unity.barracuda@3.0.0/Barracuda/Runtime/ONNX/ONNXModelConverter.cs:98)
Unity.Barracuda.ONNXModelImporter.OnImportAsset (UnityEditor.Experimental.AssetImporters.AssetImportContext ctx) (at Library/PackageCache/com.unity.barracuda@3.0.0/Barracuda/Editor/ONNXModelImporter.cs:65)
UnityEditor.Experimental.AssetImporters.ScriptedImporter.GenerateAssetData (UnityEditor.Experimental.AssetImporters.AssetImportContext ctx) (at :0)
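For context, here is a minimal sketch of how an int8 ONNX model containing QuantizeLinear nodes is typically produced. This assumes onnxruntime static quantization in QDQ format with a recent onnxruntime; the file names, the input name "data", the input shape, and the random calibration data are placeholders for illustration, not details from the original report:

```python
# Hypothetical sketch: onnxruntime static quantization (QDQ format) inserts
# QuantizeLinear/DequantizeLinear nodes into the graph, which the Barracuda
# ONNX importer rejects with the "Unknown type QuantizeLinear" error above.
import numpy as np
from onnxruntime.quantization import (CalibrationDataReader, QuantFormat,
                                      QuantType, quantize_static)


class RandomCalibrationReader(CalibrationDataReader):
    """Feeds a few random samples as calibration data (placeholder only)."""

    def __init__(self, input_name="data", shape=(1, 3, 224, 224), count=8):
        self._samples = iter(
            [{input_name: np.random.rand(*shape).astype(np.float32)}
             for _ in range(count)])

    def get_next(self):
        # Return the next calibration batch, or None when exhausted.
        return next(self._samples, None)


quantize_static(
    model_input="model_fp32.onnx",    # original float32 model (placeholder name)
    model_output="model_int8.onnx",   # quantized model that fails to import
    calibration_data_reader=RandomCalibrationReader(),
    quant_format=QuantFormat.QDQ,     # emits QuantizeLinear/DequantizeLinear pairs
    activation_type=QuantType.QUInt8,
    weight_type=QuantType.QInt8,
)
```

Any quantization flow that leaves QuantizeLinear/DequantizeLinear (or the QLinear* operators) in the exported graph will hit the same importer error, since the issue is in the ONNX import step rather than at inference time.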