Fix LayoutLM ONNX test error #13710
Conversation
Hi, can you also add LayoutLM to the list of supported models? https://huggingface.co/transformers/serialization.html Thanks!
Seems good to me! @NielsRogge I think he has already done it, or are you talking about something different?
Yeah, I'm talking about the documentation (.rst file): https://github.com/huggingface/transformers/blob/master/docs/source/serialization.rst It also seems like other models (like mBART) can be exported to ONNX but are not mentioned in the docs. I asked @sgugger if we can create an automatically updated table for this.
@NielsRogge I have added LayoutLM to the list of supported models. :)
What does this PR do?
Fixes the batch_size and seq_len computation during ONNX export in configuration_layoutlm.py.
PRs: #13702, #13562
Issue: #13300
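For context, the idea behind the fix can be sketched as follows. This is a hedged illustration, not the actual diff in configuration_layoutlm.py: the function name `build_bbox_input` and the plain-list tensors are hypothetical, but the principle matches the PR description, i.e. deriving `batch_size` and `seq_len` from the generated dummy `input_ids` instead of assuming fixed values, so that the extra `bbox` input LayoutLM needs has a matching shape during ONNX export.

```python
# Hypothetical sketch (illustrative names, not the real transformers code):
# read batch_size and seq_len off the dummy input_ids rather than hardcoding
# them, then build a bbox tensor of the same shape.

def build_bbox_input(dummy_inputs):
    """Create a zero-filled bbox tensor shaped like the dummy input_ids.

    LayoutLM expects one 4-coordinate bounding box per token, so the bbox
    tensor must be (batch_size, seq_len, 4).
    """
    input_ids = dummy_inputs["input_ids"]  # list of token-id lists
    batch_size = len(input_ids)            # derived, not hardcoded
    seq_len = len(input_ids[0])            # derived, not hardcoded
    return [[[0, 0, 0, 0] for _ in range(seq_len)] for _ in range(batch_size)]

# Example: a dummy batch with batch_size=1 and seq_len=4
dummy = {"input_ids": [[101, 2070, 2773, 102]]}
bbox = build_bbox_input(dummy)
```

If `batch_size` or `seq_len` were fixed constants instead, the exported graph's `bbox` input would mismatch the tokenizer's actual output shape, which is the kind of test failure this PR addresses.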
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@michaelbenayoun @LysandreJik @NielsRogge @mfuntowicz