Add gradient clipping #1124
Conversation
catalyst/callbacks/optimizer.py
Outdated
```diff
@@ -47,6 +48,7 @@ def __init__(
     model_key: str = None,
     optimizer_key: str = None,
     accumulation_steps: int = 1,
+    grad_clip_params: dict = None,
```
could you please add `grad_clip_fn: Callable` also?
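For context, a minimal sketch of the requested signature (the `Callable` annotation and its placement are assumptions; only `grad_clip_params` appears in the diff above):

```python
from typing import Callable, Dict


class OptimizerCallback:
    def __init__(
        self,
        model_key: str = None,
        optimizer_key: str = None,
        accumulation_steps: int = 1,
        grad_clip_fn: Callable = None,   # requested: a ready-to-use clipping callable
        grad_clip_params: Dict = None,   # from the PR: registry spec / kwargs for clipping
    ):
        ...
```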
btw, could you please add the docs for `grad_clip_fn` and `grad_clip_params`?
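Hypothetical docstring wording for the two new arguments (the phrasing is an assumption, not from the PR; `fn_name` is the key the PR's own code pops from `grad_clip_params`):

```python
"""
Args:
    grad_clip_fn: callable gradient clipping function to apply to
        ``model.parameters()``, e.g. ``torch.nn.utils.clip_grad_norm_``.
    grad_clip_params: parameters for the clipping function, e.g.
        ``{"fn_name": "clip_grad_norm_", "max_norm": 1.0}``.
"""
```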
and tests :) they could be placed under the `catalyst/callbacks/tests` folder
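A sketch of what such a test could check (the file name and the plain-PyTorch setup are assumptions; a real test would likely exercise the callback through a runner):

```python
# catalyst/callbacks/tests/test_optimizer.py  (hypothetical location/name)
import torch
from torch.nn.utils import clip_grad_norm_


def test_grad_clipping_bounds_gradient_norm():
    model = torch.nn.Linear(4, 1)
    loss = model(torch.randn(8, 4)).sum() * 1e3  # inflate the gradients
    loss.backward()
    clip_grad_norm_(model.parameters(), max_norm=1.0)
    # global 2-norm over all parameter gradients must respect max_norm
    total_norm = torch.norm(
        torch.stack([p.grad.norm() for p in model.parameters()])
    )
    assert total_norm <= 1.0 + 1e-6
```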
catalyst/callbacks/optimizer.py
Outdated
```diff
@@ -47,6 +48,7 @@ def __init__(
     model_key: str = None,
     optimizer_key: str = None,
     accumulation_steps: int = 1,
+    grad_clip_params: dict = None,
```
```diff
-grad_clip_params: dict = None,
+grad_clip_params: Dict = None,
```
catalyst/callbacks/optimizer.py
Outdated
```diff
@@ -56,6 +58,8 @@ def __init__(
     self.model = None
     self.optimizer = None
     self.criterion = None
+    self.grad_clip_fn = None
```
```diff
-self.grad_clip_fn = None
+self.grad_clip_fn = grad_clip_fn
```
catalyst/callbacks/optimizer.py
Outdated
```diff
@@ -72,6 +76,11 @@ def __init__(
     self._prefix_lr = "lr"
     self._prefix_momentum = "momentum"

+    if grad_clip_params is not None:
+        self.grad_clip_fn = REGISTRY.get(grad_clip_params.pop("fn_name"))
```
let's use `self.grad_clip_fn = partial(..., **grad_clip_params)`
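i.e., something along these lines (a sketch of the reviewer's suggestion; the defensive `dict()` copy is an extra safety not in the diff). Binding the params once at init means the hot path only has to call `self.grad_clip_fn(parameters)`:

```python
from functools import partial

if grad_clip_params is not None:
    grad_clip_params = dict(grad_clip_params)  # don't mutate the caller's dict
    grad_clip_fn = REGISTRY.get(grad_clip_params.pop("fn_name"))
    self.grad_clip_fn = partial(grad_clip_fn, **grad_clip_params)
```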
catalyst/callbacks/optimizer.py
Outdated
```diff
@@ -94,6 +103,9 @@ def on_batch_end(self, runner: "IRunner"):
     loss = runner.batch_metrics[self.metric_key]
     runner.engine.backward_loss(loss, self.model, self.optimizer)

+    if self.grad_clip_fn is not None:
+        self.grad_clip_fn(self.model.parameters(), **self.grad_clip_params)
```
```diff
-self.grad_clip_fn(self.model.parameters(), **self.grad_clip_params)
+self.grad_clip_fn(self.model.parameters())
```
btw, could you please also add a note to the …
catalyst/callbacks/optimizer.py
Outdated
```diff
@@ -47,6 +48,7 @@ def __init__(
     model_key: str = None,
     optimizer_key: str = None,
     accumulation_steps: int = 1,
+    grad_clip_params: dict = None,
```
let's define `grad_clip_fn` during init as well - it's a much easier way for Python API users
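For example, a hedged sketch of the Python-API usage this thread is asking for (passing `grad_clip_fn` directly to the constructor is the requested change, not yet the merged API):

```python
from functools import partial

from torch.nn.utils import clip_grad_norm_
from catalyst.callbacks import OptimizerCallback

callback = OptimizerCallback(
    metric_key="loss",
    grad_clip_fn=partial(clip_grad_norm_, max_norm=1.0),
)
```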
```diff
@@ -98,6 +98,9 @@ stages:
     optimizer:
       _target_: OptimizerCallback
       metric_key: *model_loss
+      grad_clip_fn: clip_grad_value_
```
could you please also add a test for `catalyst/tests/test_finetune` or something similar?
Hello @alxmamaev! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found: There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-03-24 20:14:52 UTC
This pull request is now in conflicts. @alxmamaev, could you fix it? 🙏
No description provided.