Type Inference Error of QLinearConv: inputs are expected to have tensor type #5977
Unanswered
jschauerUdS
asked this question in Q&A
Hello everyone,
I am quantizing an .onnx model in Python with the quantize_static() function. After creating an ONNX Runtime inference session, the following error appears:

File ~\AppData\Roaming\Python\Python38\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py:419 in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
File ~\AppData\Roaming\Python\Python38\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py:463 in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
Fail: [ONNXRuntimeError] : 1 : FAIL : Node (QLinearConv_token_308) Op (QLinearConv) [TypeInferenceError] inputs are expected to have tensor type.
Does anyone know how to change the input type of a node, or have another idea how to solve this problem?
I'm using:
Python: '3.8.10'
ONNX Version: '1.14.1'
ONNX Runtime Version: '1.16.2'