How to count learned model parameters #2985
Unanswered · addisonklinke asked this question in Q&A · Replies: 1 comment 1 reply
-
It should be easy to add a utility for this. I think we can use the differentiable tag to count the learnable parameters. You can learn more about this tag here: https://github.com/onnx/onnx/blob/master/docs/DefineDifferentiability.md
Question
I would like to count the total number of learned (i.e. trainable) parameters in a model given its ONNX file.
Further information
Relevant Area: model usage, best practices, operators
Is this issue related to a specific model? No, this is a generic question about ONNX graphs
Notes
In PyTorch you can do:
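The snippet referenced here is presumably the standard PyTorch idiom for counting trainable parameters, sketched below:

```python
import torch.nn as nn

def count_trainable_params(model: nn.Module) -> int:
    # Sum the element counts of all parameters that receive gradients
    return sum(p.numel() for p in model.parameters() if p.requires_grad)
```

For example, `count_trainable_params(nn.Linear(3, 4))` returns 16 (a 4x3 weight matrix plus 4 biases).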
Since the .onnx file of a model includes its weights, it seems there should be a similar method for analyzing a loaded model. I understand that the requires_grad attribute would no longer be present, since a loaded ONNX model is not associated with a training framework. However, if possible, it would still be useful to distinguish between total parameters (which contribute to FLOPs) and learnable parameters (which were updated via backprop during training).

Are there existing class methods or best practices to fill in the
...
sections above? I am open to non-Python approaches.