I am trying to run LayoutLMv2. When I run the code from the documentation:
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = AutoModel.from_pretrained("microsoft/layoutlmv2-base-uncased")
```
I get the below error:
```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input> in <module>()
      1 from transformers import AutoTokenizer, AutoModel
      2
----> 3 tokenizer = AutoTokenizer.from_pretrained("microsoft/layoutlmv2-base-uncased")
      4
      5 model = AutoModel.from_pretrained("microsoft/layoutlmv2-base-uncased")

/usr/local/lib/python3.7/dist-packages/transformers/models/auto/tokenization_auto.py in from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs)
    532         if config_tokenizer_class is None:
    533             if not isinstance(config, PretrainedConfig):
--> 534                 config = AutoConfig.from_pretrained(pretrained_model_name_or_path, **kwargs)
    535             config_tokenizer_class = config.tokenizer_class
    536

/usr/local/lib/python3.7/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    450         config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
    451         if "model_type" in config_dict:
--> 452             config_class = CONFIG_MAPPING[config_dict["model_type"]]
    453             return config_class.from_dict(config_dict, **kwargs)
    454         else:

KeyError: 'layoutlmv2'
```
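The traceback shows where the failure originates: `AutoConfig` looks up the config's `model_type` (`"layoutlmv2"`) in a mapping that only contains the architectures the installed release ships with. A minimal sketch of that failure mode, with a plain dict standing in for transformers' `CONFIG_MAPPING` (the dict contents here are illustrative, not the real mapping):

```python
# Plain-dict stand-in for transformers' CONFIG_MAPPING: a given release only
# registers the model types it shipped with, so unknown types raise KeyError.
CONFIG_MAPPING = {"bert": "BertConfig", "layoutlm": "LayoutLMConfig"}

def resolve_config_class(model_type: str) -> str:
    # Mirrors the lookup in AutoConfig.from_pretrained: an unregistered
    # model_type fails, but we re-raise with a clearer message.
    try:
        return CONFIG_MAPPING[model_type]
    except KeyError:
        raise KeyError(
            f"{model_type!r} is not registered in this transformers install; "
            "a newer version may be required."
        ) from None
```

Here `resolve_config_class("layoutlm")` succeeds, while `resolve_config_class("layoutlmv2")` raises the same kind of `KeyError` seen above.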
@NielsRogge
Hello @nurgel! LayoutLMv2 is not merged yet, so it isn't available in the latest release. You can follow its development in #12604.
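Until the model lands in a release, a small version gate can turn the opaque `KeyError` into an actionable message. The sketch below assumes LayoutLMv2 first ships in transformers v4.10.0 (an assumption; verify against the release notes once #12604 is merged) and compares that against the installed version string:

```python
# Sketch: gate LayoutLMv2 usage on the installed transformers version.
# The required version "4.10.0" is an assumption, not confirmed by the issue.
def version_tuple(v: str) -> tuple:
    # Parse the leading "major.minor.patch" of a version string.
    return tuple(int(p) for p in v.split(".")[:3])

def supports_layoutlmv2(installed: str, required: str = "4.10.0") -> bool:
    # Tuple comparison handles e.g. 4.9.2 < 4.10.0 correctly,
    # unlike a plain string comparison.
    return version_tuple(installed) >= version_tuple(required)
```

For the environment reported here (`transformers` 4.9.2), `supports_layoutlmv2("4.9.2")` returns `False`, which matches the `KeyError` above.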
Environment info

- `transformers` version: 4.9.2