Replies: 1 comment
Hi @nukelash,
Those symbolic dimension names come from the exporter that generated the ONNX model, in this case PyTorch. Sometimes the exporter simply doesn't try its best to figure out the shape of every tensor and leaves symbolic names in the graph. But don't worry, any ONNX importer can take care of that by running shape inference.
You can try
Not sure how you exported the model, but either way, the missing output shapes come from the exporter.
-
Hi, I am fairly new to ONNX and am trying to use the C++ ONNX Runtime API to run inference on a model trained in PyTorch. I've been running into issues with dimensions of -1 in my output tensors, and while investigating that, I noticed that the dimensions of my output in Netron aren't "explicit". In other words, all of my input nodes show something like "name: input_spec type: float32[1,1,3,241,2]", whereas my output nodes say "name: output_spec type: float32[ScatterNDoutput_spec_dim_0,ScatterNDoutput_spec_dim_1,ScatterNDoutput_spec_dim_2,ScatterNDoutput_spec_dim_3,ScatterNDoutput_spec_dim_4]" (ScatterND is the node right before the output node). However, when I inspect other ONNX models such as ResNet, those output dimensions are explicit (save for the batch size N).
My model was not exported with dynamic axes, and as far as I understand, it shouldn't be, because all of my inferences will use a single batch, a fixed-size input, and a fixed-size output.
My questions are:
Thanks in advance!