How to split a >2GB ONNX model with external data? #5407
Unanswered
lukolszewski asked this question in Q&A
Replies: 2 comments
- Did you try converting the model data to be external, as described here?
- I want an answer to this question too. I have a model that is more than 10GB. How do I break the model into multiple parts?
I have a large (substantially larger than 2GB) ONNX model with external data files. I've been having trouble with what I thought would be a fairly simple task: slicing off part of that model at a node. The resulting "small" model is also over 2GB, by the way.
Does anyone know a way to do this without converting the .onnx binary file to text with Protocol Buffers and then spending ages hand-editing the resulting text file?
I've tried onnx.utils.extract_model; it fails with:
ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 28869931499
I tried with check_model=False, with the same result.
I also tried sne4onnx, a Python tool written to bypass the 2GB limitation of onnx.utils.extract_model, but it fails with the exact same error message. I suspect it would let me work with models over 2GB overall, but the chunk being sliced has to be under 2GB.
So, how do people split large (substantially larger than 2GB) ONNX models?