
upgrade transformers to 4.13.0 #1659

Merged
merged 22 commits into from Dec 11, 2021

Conversation

julian-risch
Member

@julian-risch julian-risch commented Oct 27, 2021

Proposed changes:

The failing test cases are caused by the changes in the following PR in transformers: huggingface/transformers#13873

There is now a fix on transformers' master branch, but no new release yet: huggingface/transformers@24b30d4
However, another problem remains, concerning the scatter method in torch used by TAPAS.
Maybe it can be resolved with:
pip install torch-scatter -f https://data.pyg.org/whl/torch-1.10.0+${CUDA}.html
https://github.com/rusty1s/pytorch_scatter
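The wheel index URL in that command has to match both the installed torch version and the CUDA build. A minimal sketch of how such a URL could be assembled (the `scatter_wheel_index` helper is hypothetical, not part of Haystack or pytorch_scatter):

```python
def scatter_wheel_index(torch_version, cuda_version=None):
    """Build the pytorch_scatter wheel index URL for a torch/CUDA combo.

    torch_version: e.g. "1.10.0" or "1.10.2+cu113" (patch level is ignored,
    since wheels are published per minor release).
    cuda_version: e.g. "11.3" -> "cu113"; None selects the CPU-only build.
    """
    # Reduce e.g. "1.10.2+cu113" to the minor-release form "1.10.0".
    base = ".".join(torch_version.split("+")[0].split(".")[:2]) + ".0"
    suffix = "cpu" if cuda_version is None else "cu" + cuda_version.replace(".", "")
    return f"https://data.pyg.org/whl/torch-{base}+{suffix}.html"

print(scatter_wheel_index("1.10.0", "11.3"))
# Then: pip install torch-scatter -f <printed URL>
```

This only illustrates the URL pattern used by the wheel index; for a CPU-only environment, `scatter_wheel_index("1.10.0")` would point at the `+cpu` index instead.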

closes #1658

Status (please check what you already did):

  • First draft (up for discussions & feedback)
  • Final code
  • Added tests
  • Updated documentation

@tholor
Member

tholor commented Oct 29, 2021

Good timing - let's move directly to 4.12: https://github.com/huggingface/transformers/releases/tag/v4.12.0

@lalitpagaria
Contributor

> Good timing - let's move directly to 4.12: https://github.com/huggingface/transformers/releases/tag/v4.12.0

I would not suggest using an x.y.0 release; better to wait for at least the third patch version. Of late, I have seen many bugs in their x.y.0 releases that get resolved in patch versions. 😅

@julian-risch
Member Author

> > Good timing - let's move directly to 4.12: https://github.com/huggingface/transformers/releases/tag/v4.12.0
>
> I would not suggest using an x.y.0 release; better to wait for at least the third patch version. Of late, I have seen many bugs in their x.y.0 releases that get resolved in patch versions. 😅

Good point! Luckily, there is 4.12.2 already: https://github.com/huggingface/transformers/releases?page=1 Guess why. ;)
I'm still trying out different versions to find out which change causes the problems. Maybe something about changed default values for padding.

@lalitpagaria
Contributor

> I'm still trying out different versions to find out which change causes the problems. Maybe something about changed default values for padding.

I see you are performing a sequential search. How about a binary search? :)

@julian-risch
Member Author

> I see you are performing a sequential search. How about a binary search? :)

That would assume that whenever the tests fail, something needs to be changed in Haystack. However, transformers introduces bugs on their side from time to time, for example in 4.11.0 (as you mentioned).

What I found out now is that 4.11.3 makes our tests test_table_reader, test_reader and test_extractive_qa_pipeline fail, whereas 4.11.1 works well (besides a small error in zero-shot classification, which I assume has been fixed on the transformers side).

@lalitpagaria
Contributor

> > I see you are performing a sequential search. How about a binary search? :)
>
> That would assume that whenever the tests fail, something needs to be changed in Haystack. However, transformers introduces bugs on their side from time to time, for example in 4.11.0 (as you mentioned).
>
> What I found out now is that 4.11.3 makes our tests test_table_reader, test_reader and test_extractive_qa_pipeline fail, whereas 4.11.1 works well (besides a small error in zero-shot classification, which I assume has been fixed on the transformers side).

No, I meant to speed up finding the rogue version.
We could try the midpoint between 4.7.0 and 4.12.2; if the build passes, pick the next version between (mid, 4.12.2), otherwise between (4.7.0, mid), and so on. :)
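The bisection idea above could be sketched like this (the `first_bad` and `build_passes` names are hypothetical; in practice, `build_passes` would pin the transformers version and run the test suite):

```python
def first_bad(versions, build_passes):
    """Return the first version in the sorted list whose build fails,
    or None if every build passes. Assumes failures are monotonic:
    once a version fails, all later versions fail too."""
    lo, hi = 0, len(versions) - 1
    result = None
    while lo <= hi:
        mid = (lo + hi) // 2
        if build_passes(versions[mid]):
            lo = mid + 1          # rogue version must be later
        else:
            result = versions[mid]
            hi = mid - 1          # check for an even earlier failure
    return result

versions = ["4.7.0", "4.8.2", "4.9.1", "4.10.3", "4.11.1", "4.11.3", "4.12.2"]
# Simulate the situation in this thread: everything from 4.11.3 on is broken.
broken_from = versions.index("4.11.3")
print(first_bad(versions, lambda v: versions.index(v) < broken_from))
```

With seven candidate versions, this needs about three test-suite runs instead of up to seven, though the monotonicity assumption can break if transformers fixes one bug and introduces another in between.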

@deepset-ai deepset-ai deleted a comment from CLAassistant Nov 17, 2021
@tholor
Member

tholor commented Nov 30, 2021

Quick update: PyTorch has already been upgraded to 1.10 in a separate PR (#1789). For transformers, we are still waiting for one bug fix. The fix is already merged, but unfortunately it didn't make it into their last release. We should be ready to bump the transformers version with their next release.

@julian-risch julian-risch changed the title upgrade to pytorch 1.10 and transformers 4.11.3 upgrade to pytorch 1.10 and transformers 4.13.0 Dec 10, 2021
@julian-risch julian-risch marked this pull request as ready for review December 10, 2021 15:22
@julian-risch julian-risch changed the title upgrade to pytorch 1.10 and transformers 4.13.0 upgrade transformers to 4.13.0 Dec 10, 2021
@tholor tholor merged commit 2c184e4 into master Dec 11, 2021
@tholor tholor deleted the upgrade_pytorch_transformers branch December 11, 2021 11:08