
Commit

Fix warning situation: UserWarning: "max_length is ignored when padding=True" (huggingface#13829)

* Removed wrong warning

* Raise a warning when `max_length` is given with wrong `truncation`

* Update the error message

* Update the warning message

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

2 people authored and Alberto Bégué committed Jan 27, 2022
1 parent b3fc528 commit 93adfe1
Showing 1 changed file with 5 additions and 2 deletions.
7 changes: 5 additions & 2 deletions src/transformers/tokenization_utils_base.py
@@ -2223,8 +2223,11 @@ def _get_padding_truncation_strategies(
         elif padding is not False:
             if padding is True:
                 if verbose:
-                    if max_length is not None:
-                        warnings.warn("`max_length` is ignored when `padding`=`True`.")
+                    if max_length is not None and (truncation is False or truncation == "do_not_truncate"):
+                        warnings.warn(
+                            "`max_length` is ignored when `padding`=`True` and there is no truncation strategy. "
+                            "To pad to max length, use `padding='max_length'`."
+                        )
                     if old_pad_to_max_length is not False:
                         warnings.warn("Though `pad_to_max_length` = `True`, it is ignored because `padding`=`True`.")
                 padding_strategy = PaddingStrategy.LONGEST  # Default to pad to the longest sequence in the batch
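The behavioral change can be sketched outside the library as a standalone predicate. This is a minimal illustration of the patched condition, not the transformers API; the helper name `should_warn_max_length_ignored` is hypothetical. Before this fix, `padding=True` with any `max_length` always warned; after it, the warning fires only when no truncation strategy is active, since truncation is what actually makes `max_length` meaningful.

```python
import warnings


def should_warn_max_length_ignored(padding, max_length, truncation):
    """Hypothetical helper mirroring the patched check in
    _get_padding_truncation_strategies: warn about an ignored
    `max_length` only when padding=True AND truncation is off."""
    if (
        padding is True
        and max_length is not None
        and (truncation is False or truncation == "do_not_truncate")
    ):
        warnings.warn(
            "`max_length` is ignored when `padding`=`True` and there is no truncation strategy. "
            "To pad to max length, use `padding='max_length'`."
        )
        return True  # warning emitted
    return False  # no warning: truncation is active, or no max_length given


# padding=True + max_length, truncation disabled -> warns (max_length is ignored)
# padding=True + max_length, truncation=True    -> silent (the case wrongly warned before)
```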
