Commit 9969d6d
Fix warning for gradient_checkpointing (huggingface#13767)
sgugger authored and Alberto committed Jan 13, 2022
1 parent c3ae681 commit 9969d6d
Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/configuration_utils.py (1 addition, 1 deletion)
@@ -332,7 +332,7 @@ def __init__(self, **kwargs):
         self.transformers_version = kwargs.pop("transformers_version", None)

         # Deal with gradient checkpointing
-        if kwargs.get("gradient_checkpointing", True):
+        if kwargs.get("gradient_checkpointing", False):
             warnings.warn(
                 "Passing `gradient_checkpointing` to a config initialization is deprecated and will be removed in v5 "
                 "Transformers. Using `model.gradient_checkpointing_enable()` instead, or if you are using the "
                 (remainder of diff truncated)
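The bug the diff fixes: with a default of `True`, `kwargs.get("gradient_checkpointing", True)` is truthy even when the caller never passed the argument, so every config initialization emitted the deprecation warning. Changing the default to `False` limits the warning to callers who actually pass a truthy value. The sketch below demonstrates this with a hypothetical `init_config` stand-in (not the actual `PretrainedConfig.__init__`):

```python
import warnings

def init_config(**kwargs):
    # Fixed logic: warn only when the caller explicitly passed a truthy
    # `gradient_checkpointing`. The old default of True made this branch
    # fire even when the key was absent from kwargs.
    if kwargs.get("gradient_checkpointing", False):
        warnings.warn(
            "Passing `gradient_checkpointing` to a config initialization "
            "is deprecated; use `model.gradient_checkpointing_enable()`.",
            FutureWarning,
        )

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    init_config()  # argument absent: no warning after the fix
    assert len(caught) == 0
    init_config(gradient_checkpointing=True)  # explicit truthy value: warns
    assert len(caught) == 1
```

With the pre-fix default of `True`, the first call above would also have appended a warning, which is exactly the spurious behavior the commit removes.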
