Centralize location of rank_zero utility functions #11746

Closed
ananthsub opened this issue Feb 4, 2022 · 0 comments · Fixed by #11747 or #11793

ananthsub commented Feb 4, 2022

Proposed refactor

The rank_zero_* utilities are currently split between distributed.py and warnings.py:

distributed.py: https://github.com/PyTorchLightning/pytorch-lightning/blob/8c07d8bf905e395bbd2142b5df7b185b8e936c41/pytorch_lightning/utilities/distributed.py#L47-L95

warnings.py: https://github.com/PyTorchLightning/pytorch-lightning/blob/8c07d8bf905e395bbd2142b5df7b185b8e936c41/pytorch_lightning/utilities/warnings.py#L23-L51

Proposal: Move all of the rank_zero_* utilities into their own module.

Motivation

It is currently impossible to deprecate something from within distributed.py using the standard Lightning conventions without creating a circular import.

Creating a separate module solves this and makes the dependency structure clearer.

I'm facing this issue in https://github.com/PyTorchLightning/pytorch-lightning/pull/11745/files
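
To make the cycle concrete, here is a small, self-contained repro. The `toy_utils` package and its two files only stand in for `pytorch_lightning/utilities/`, and the function bodies are simplified: `warnings.py` builds its helpers on the `rank_zero_only` decorator from `distributed.py`, so the moment `distributed.py` wants `rank_zero_deprecation` from `warnings.py`, the import fails.

```python
# Minimal, self-contained repro of the import cycle. "toy_utils" stands in for
# pytorch_lightning.utilities; the module contents are simplified stand-ins.
import sys
import tempfile
import textwrap
from pathlib import Path

pkg = Path(tempfile.mkdtemp()) / "toy_utils"
pkg.mkdir()
(pkg / "__init__.py").write_text("")

# "distributed.py" defines rank_zero_only and now also wants to emit a deprecation
(pkg / "distributed.py").write_text(textwrap.dedent("""
    from toy_utils.warnings import rank_zero_deprecation  # <- closes the cycle

    def rank_zero_only(fn):
        return fn

    rank_zero_deprecation("old_name is deprecated")
"""))

# "warnings.py" builds rank_zero_deprecation on top of rank_zero_only
(pkg / "warnings.py").write_text(textwrap.dedent("""
    from toy_utils.distributed import rank_zero_only

    @rank_zero_only
    def rank_zero_deprecation(message):
        print(f"DeprecationWarning: {message}")
"""))

sys.path.insert(0, str(pkg.parent))
try:
    import toy_utils.distributed  # noqa: F401
except ImportError as err:
    print(f"circular import: {err}")
```

A module that neither file depends on (the pitch below) removes the cycle entirely.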

Pitch

Create a new module, utilities/rank_zero.py, which will house all of these utilities.
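
A rough sketch of what that module could look like, reusing the names currently exported from distributed.py and warnings.py. The exact rank resolution and warning categories would be carried over from the existing implementations, so treat this as an outline rather than the final code:

```python
# pytorch_lightning/utilities/rank_zero.py -- illustrative outline only
import logging
import os
import warnings
from functools import wraps
from typing import Any, Callable, Optional

log = logging.getLogger(__name__)


def rank_zero_only(fn: Callable) -> Callable:
    """Call the wrapped function only on global rank 0."""

    @wraps(fn)
    def wrapped_fn(*args: Any, **kwargs: Any) -> Optional[Any]:
        if rank_zero_only.rank == 0:
            return fn(*args, **kwargs)
        return None

    return wrapped_fn


# Default to the environment rank so the decorator works before distributed init.
# The real implementation consults more environment variables than this.
rank_zero_only.rank = int(os.environ.get("RANK", os.environ.get("LOCAL_RANK", 0)))


@rank_zero_only
def rank_zero_debug(*args: Any, **kwargs: Any) -> None:
    log.debug(*args, **kwargs)


@rank_zero_only
def rank_zero_info(*args: Any, **kwargs: Any) -> None:
    log.info(*args, **kwargs)


@rank_zero_only
def rank_zero_warn(message: str, **kwargs: Any) -> None:
    warnings.warn(message, **kwargs)


@rank_zero_only
def rank_zero_deprecation(message: str, **kwargs: Any) -> None:
    rank_zero_warn(message, category=DeprecationWarning, **kwargs)
```

distributed.py and warnings.py could then simply re-import these names (emitting a deprecation message for the old import paths) so downstream code keeps working during a deprecation window.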

Alternatives

Move the rank_zero_* utilities from warnings.py back into distributed.py.


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

  • Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.

cc @justusschock @awaelchli @akihironitta @rohitgr7

ananthsub changed the title from "Colocate rank_zero utility functions" to "Centralize location of rank_zero utility functions" Feb 4, 2022
ananthsub added a commit to ananthsub/pytorch-lightning that referenced this issue Feb 4, 2022
ananthsub added a commit to ananthsub/pytorch-lightning that referenced this issue Feb 5, 2022
ananthsub added this to the 1.6 milestone Feb 6, 2022
akihironitta pushed a commit that referenced this issue Feb 7, 2022
* Centralize rank_zero_only utilities into their own module

Fixes #11746

* PossibleUserWarning

* Update test_warnings.py

* update imports

* more imports

* Update CHANGELOG.md

* Update mlflow.py

* Update cli.py

* Update api_references.rst

* Update meta.py

* add deprecation tests

* debug standalone

* fix standalone tests

* Update CHANGELOG.md
ananthsub linked a pull request Feb 7, 2022 that will close this issue