AttributeError: 'NoneType' object has no attribute 'from_pretrained' #8864
Comments
Same here, a couple of hours ago.
`from transformers import AutoTokenizer`
`AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-fr")` — which works on v4.0.0 and on
Putting a better error message in #8881.
Right, I was using `AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-fr")`. Thanks! It looks like the tokenizer previously output torch tensors and now outputs lists. Is this intended? It breaks existing code.
Yes, this was a bug. Tokenizers are framework-agnostic and should not output a specific framework's tensors. The implementation of the Marian tokenizer was not respecting the API in that regard. Tokenizers can still return torch tensors, but you need to specify that you want them: `tokenizer(xxx, return_tensors="pt")`. I guess in your situation it has to do with `tokenizer.prepare_seq2seq_batch(xxx, return_tensors="pt")`.
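A minimal sketch of the distinction described above (the `"Hello world"` input is an arbitrary placeholder, not from the thread; assumes `transformers` and `torch` are installed and the model card can be fetched):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-fr")

# Default behavior: framework-agnostic output, input_ids is a plain Python list.
batch = tokenizer("Hello world")
print(type(batch["input_ids"]))  # list of ints

# Explicitly request PyTorch tensors via return_tensors="pt".
pt_batch = tokenizer("Hello world", return_tensors="pt")
print(type(pt_batch["input_ids"]))  # torch.Tensor of shape (1, seq_len)
```

Code that relied on the old (buggy) tensor output just needs the `return_tensors="pt"` argument added to keep working.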
Thanks! |
This code was working yesterday but doesn't work today.