
Add gradient clipping #1124

Merged: 7 commits merged into catalyst-team:master on Mar 24, 2021

Conversation

alxmamaev (Contributor)

No description provided.

@@ -47,6 +48,7 @@ def __init__(
         model_key: str = None,
         optimizer_key: str = None,
         accumulation_steps: int = 1,
+        grad_clip_params: dict = None,
Member:

could you please add grad_clip_fn: Callable also?

Member:

btw, could you please add the docs for grad_clip_fn and grad_clip_params?
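A possible shape for those docs, sketched here only to illustrate the request (the wording is mine, not the text that was merged):

```python
# Hypothetical docstring fragment for OptimizerCallback.__init__ (illustrative only):
"""
Args:
    grad_clip_fn: callable gradient clipping function, for example
        ``torch.nn.utils.clip_grad_norm_`` or ``torch.nn.utils.clip_grad_value_``
    grad_clip_params: keyword arguments for ``grad_clip_fn``, for example
        ``{"max_norm": 1.0}`` when using ``clip_grad_norm_``
"""
```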

Member:

and tests :)
they could be placed under catalyst/callbacks/tests folder
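A minimal test along those lines might look like the sketch below; the file location and the exact constructor arguments are assumptions based on this PR's diff, not the tests that were actually merged.

```python
# Hypothetical test sketch, e.g. under catalyst/callbacks/tests/ (assumed location):
from functools import partial

from torch.nn.utils import clip_grad_norm_

from catalyst.callbacks import OptimizerCallback


def test_optimizer_callback_accepts_grad_clip_fn():
    # grad_clip_fn is the new argument proposed in this PR (assumed keyword name)
    callback = OptimizerCallback(
        metric_key="loss",
        grad_clip_fn=partial(clip_grad_norm_, max_norm=1.0),
    )
    assert callable(callback.grad_clip_fn)
```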

@@ -47,6 +48,7 @@ def __init__(
         model_key: str = None,
         optimizer_key: str = None,
         accumulation_steps: int = 1,
+        grad_clip_params: dict = None,
Member:

Suggested change:
-        grad_clip_params: dict = None,
+        grad_clip_params: Dict = None,

@@ -56,6 +58,8 @@ def __init__(
         self.model = None
         self.optimizer = None
         self.criterion = None
+        self.grad_clip_fn = None
Member:

Suggested change:
-        self.grad_clip_fn = None
+        self.grad_clip_fn = grad_clip_fn

@@ -72,6 +76,11 @@ def __init__(
         self._prefix_lr = "lr"
         self._prefix_momentum = "momentum"

+        if grad_clip_params is not None:
+            self.grad_clip_fn = REGISTRY.get(grad_clip_params.pop("fn_name"))
Member:

let's use self.grad_clip_fn = partial(..., **grad_clip_params)
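In other words, bind the clipping kwargs once at construction time so the per-batch call needs no extra arguments. A self-contained sketch of that idea, using torch's built-in clipping function in place of the REGISTRY lookup (the config values here are made up):

```python
from functools import partial

import torch
from torch.nn.utils import clip_grad_norm_

# hypothetical config, e.g. {"fn_name": "clip_grad_norm_", "max_norm": 1.0}
grad_clip_params = {"max_norm": 1.0}
grad_clip_fn = partial(clip_grad_norm_, **grad_clip_params)  # kwargs bound once

model = torch.nn.Linear(4, 2)
loss = model(torch.randn(8, 4)).sum()
loss.backward()

# later, per batch: only the parameters are passed, as in the suggestion below
grad_clip_fn(model.parameters())
```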

@@ -94,6 +103,9 @@ def on_batch_end(self, runner: "IRunner"):
             loss = runner.batch_metrics[self.metric_key]
             runner.engine.backward_loss(loss, self.model, self.optimizer)

+            if self.grad_clip_fn is not None:
+                self.grad_clip_fn(self.model.parameters(), **self.grad_clip_params)
Member:

Suggested change:
-                self.grad_clip_fn(self.model.parameters(), **self.grad_clip_params)
+                self.grad_clip_fn(self.model.parameters())

@Scitator (Member)

btw, could you please also add a note to the Changelog.md?

@mergify (bot) dismissed Scitator's stale review, March 23, 2021 20:21

Pull request has been modified.

@@ -47,6 +48,7 @@ def __init__(
         model_key: str = None,
         optimizer_key: str = None,
         accumulation_steps: int = 1,
+        grad_clip_params: dict = None,
Member:

let's define grad_clip_fn during init as well - it's a much easier way for Python API users
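For Python API users, the usage could then look roughly like this sketch (keyword names are taken from this PR's diff; treat them as assumptions until checked against the released version):

```python
from functools import partial

from torch.nn.utils import clip_grad_value_

from catalyst.callbacks import OptimizerCallback

# clip every gradient element to [-1.0, 1.0]; clip_value is a torch argument,
# grad_clip_fn is the keyword proposed in this PR
optimizer_callback = OptimizerCallback(
    metric_key="loss",
    grad_clip_fn=partial(clip_grad_value_, clip_value=1.0),
)
```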

@@ -98,6 +98,9 @@ stages:
     optimizer:
       _target_: OptimizerCallback
       metric_key: *model_loss
+      grad_clip_fn: clip_grad_value_
Member:

could you please also add a test for catalyst/tests/test_finetune or something similar?

@pep8speaks commented Mar 24, 2021

Hello @alxmamaev! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-03-24 20:14:52 UTC

@mergify (bot) commented Mar 24, 2021

This pull request is now in conflicts. @alxmamaev, could you fix it? 🙏

@Scitator merged commit f153d69 into catalyst-team:master on Mar 24, 2021