
Create loggers property for Trainer and LightningModule #11683

Merged
63 commits
2883fb2
Implement logger property, replace self.logger
akashkw Feb 1, 2022
3ae0a6e
Fix small bugs
akashkw Feb 1, 2022
6d917a4
Fixed initalization bug
akashkw Feb 1, 2022
30d3a57
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Feb 1, 2022
62c9335
Fix for case where creating a LoggerCollection of size 1
akashkw Feb 1, 2022
99a4d98
Better fix for the LoggerCollection of size 1 issue
akashkw Feb 1, 2022
01287ba
Change trainer.loggers from a property to an instance variable
akashkw Feb 1, 2022
41cbab0
Revert all instances of trainer.loggers being used
akashkw Feb 1, 2022
d78d98f
Use logger param to initialize trainer.loggers
akashkw Feb 1, 2022
7f809ff
Remove unneeded newlines
akashkw Feb 1, 2022
179f3d4
Implement unit test for loggers property
akashkw Feb 1, 2022
1ad542a
make trainer.loggers by default an empty list
akashkw Feb 1, 2022
c5df0ad
Update changelog
akashkw Feb 1, 2022
9d564d8
fix unit test according to suggestions
akashkw Feb 1, 2022
9d7c1bf
Update CHANGELOG.md
akashkw Feb 1, 2022
d263659
Remove unnecessary Trainer params
akashkw Feb 2, 2022
befad11
Remove tmpdir parameter for unit test
akashkw Feb 2, 2022
4773c26
Write setters for logger and loggers
akashkw Feb 2, 2022
8871c36
Unit test for setters
akashkw Feb 2, 2022
65ae649
Fix bug where logger setter is called twice
akashkw Feb 2, 2022
df992de
Fix initialization bug with trainer test
akashkw Feb 2, 2022
5e47ef4
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Feb 2, 2022
d55f626
Get rid of extra DummyLogger assignment
akashkw Feb 2, 2022
3debee7
Merge branch 'refactor/create-loggers-property' of github.com:akashkw…
akashkw Feb 2, 2022
29778b1
flake and mypy fixes
akashkw Feb 2, 2022
506e5fd
Flake fix did not commit properly
akashkw Feb 2, 2022
144169b
Small changes based on suggestions
akashkw Feb 2, 2022
bdcbcfb
Shorten setters and update unit test
akashkw Feb 2, 2022
d42ce90
Move unit test to a new file
akashkw Feb 2, 2022
4001efc
flake and mypy fixes
akashkw Feb 2, 2022
9401fb7
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Feb 2, 2022
e2be787
Refactor setter to handle special case of size 1 LoggerCollection
akashkw Feb 3, 2022
1e71c40
Remove DummyLogger changes
akashkw Feb 3, 2022
4418f10
Merge branch 'master' into refactor/create-loggers-property
akashkw Feb 3, 2022
c96534b
Commit suggestion to change None to []
akashkw Feb 3, 2022
6a6e2ef
Merge branch 'master' into refactor/create-loggers-property
akashkw Feb 3, 2022
7fef6fd
Decouple both setters for readability
akashkw Feb 3, 2022
f2a598f
Fix merge conflicts
akashkw Feb 3, 2022
a669a98
Fix tiny bug in trainer.loggers setter
akashkw Feb 3, 2022
b6d6fba
Merge branch 'master' into refactor/create-loggers-property
akashkw Feb 3, 2022
8714857
Add loggers property to lightning.py
akashkw Feb 3, 2022
a7a47f1
update changelog
akashkw Feb 3, 2022
15fa585
Add logger property to docs
akashkw Feb 3, 2022
3f195be
Fix typo
akashkw Feb 3, 2022
c944f9a
update trainer.rst
akashkw Feb 3, 2022
e3312d5
correct spacing
akashkw Feb 3, 2022
88b3c24
remove typing for now
akashkw Feb 3, 2022
89f5037
Merge branch 'master' into refactor/create-loggers-property
akashkw Feb 3, 2022
85256e7
Fix jit unused issue
akashkw Feb 3, 2022
7ab3683
Fix underlines in docs
akashkw Feb 3, 2022
c51f336
Updates based on suggestions
akashkw Feb 4, 2022
ec99acd
More updates to docs based on suggestions
akashkw Feb 4, 2022
ba09e27
Create unit test for lightningmodule loggers property
akashkw Feb 4, 2022
bc6fd72
Replace Mock with Trainer
akashkw Feb 4, 2022
f5b492d
Update types
akashkw Feb 4, 2022
8800ec8
Remove list cast
akashkw Feb 4, 2022
5852ee2
Remove unit test for unsupported behavior
akashkw Feb 4, 2022
0752fb5
Handle special case for setter
akashkw Feb 4, 2022
46b5ae1
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Feb 4, 2022
3480c38
Update docs/source/common/lightning_module.rst
akashkw Feb 9, 2022
cd90c34
Refactor docs and trainer according to suggestions
akashkw Feb 9, 2022
cd43e25
Resolve merge conflicts
akashkw Feb 9, 2022
338ba43
Update unit tests with new behavior
akashkw Feb 9, 2022
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -80,6 +80,13 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

- Added a `MisconfigurationException` if user provided `opt_idx` in scheduler config doesn't match with actual optimizer index of its respective optimizer ([#11247](https://github.com/PyTorchLightning/pytorch-lightning/pull/11247))


- Added a `loggers` property to `Trainer` which returns a list of loggers provided by the user ([#11683](https://github.com/PyTorchLightning/pytorch-lightning/pull/11683))


- Added a `loggers` property to `LightningModule` which retrieves the `loggers` property from `Trainer` ([#11683](https://github.com/PyTorchLightning/pytorch-lightning/pull/11683))


- Added support for DDP when using a `CombinedLoader` for the training data ([#11648](https://github.com/PyTorchLightning/pytorch-lightning/pull/11648))


11 changes: 11 additions & 0 deletions docs/source/common/lightning_module.rst
@@ -983,6 +983,17 @@ The current logger being used (tensorboard or other supported logger)
        # the particular logger
        tensorboard_logger = self.logger.experiment

loggers
~~~~~~~

The list of loggers currently being used.

.. code-block:: python

    def training_step(self, batch, batch_idx):
        # List of LightningLoggerBase objects
        self.loggers

local_rank
~~~~~~~~~~~

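The rST snippet above only names `self.loggers`; the intended usage pattern — logging to every configured backend from inside a step — can be sketched without Lightning installed. `FakeLogger` and its fields are hypothetical stand-ins, not the real `LightningLoggerBase` API beyond the `log_metrics` method the docs rely on:

```python
class FakeLogger:
    """Stand-in for a LightningLoggerBase subclass (assumes loggers expose log_metrics)."""

    def __init__(self, name):
        self.name = name
        self.logged = []  # records every metrics dict sent to this backend

    def log_metrics(self, metrics):
        self.logged.append(metrics)


def training_step_logging(loggers, loss):
    """Mimics iterating self.loggers inside training_step to fan metrics out."""
    for logger in loggers:
        logger.log_metrics({"train_loss": loss})


tb, wandb = FakeLogger("tb"), FakeLogger("wandb")
training_step_logging([tb, wandb], loss=0.25)
print(tb.logged)  # [{'train_loss': 0.25}]
```

Because `loggers` is always a list (empty when no logger is configured), the loop degrades gracefully to a no-op.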
18 changes: 15 additions & 3 deletions docs/source/common/trainer.rst
@@ -1741,9 +1741,21 @@ The current logger being used. Here's an example using tensorboard

.. code-block:: python

    def training_step(self, batch, batch_idx):
        logger = self.trainer.logger
        tensorboard = logger.experiment
    logger = trainer.logger
    tensorboard = logger.experiment


loggers (p)
***********
The list of loggers currently being used.

.. code-block:: python

    # List of LightningLoggerBase objects
    loggers = trainer.loggers
    for logger in loggers:
        logger.log_metrics({"foo": 1.0})


logged_metrics
9 changes: 8 additions & 1 deletion pytorch_lightning/core/lightning.py
@@ -37,6 +37,7 @@
from pytorch_lightning.core.mixins import DeviceDtypeModuleMixin, HyperparametersMixin
from pytorch_lightning.core.optimizer import LightningOptimizer
from pytorch_lightning.core.saving import ModelIO
from pytorch_lightning.loggers import LightningLoggerBase
from pytorch_lightning.trainer.connectors.logger_connector.fx_validator import _FxValidator
from pytorch_lightning.utilities import (
_IS_WINDOWS,
@@ -80,6 +81,7 @@ class LightningModule(
        "global_rank",
        "local_rank",
        "logger",
        "loggers",
        "model_size",
        "automatic_optimization",
        "truncated_bptt_steps",
@@ -252,10 +254,15 @@ def truncated_bptt_steps(self, truncated_bptt_steps: int) -> None:
        self._truncated_bptt_steps = truncated_bptt_steps

    @property
    def logger(self):
    def logger(self) -> Optional[LightningLoggerBase]:
        """Reference to the logger object in the Trainer."""
        return self.trainer.logger if self.trainer else None

    @property
    def loggers(self) -> List[LightningLoggerBase]:
        """Reference to the list of loggers in the Trainer."""
        return self.trainer.loggers if self.trainer else []

    def _apply_batch_transfer_handler(
        self, batch: Any, device: Optional[torch.device] = None, dataloader_idx: int = 0
    ) -> Any:
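The property added to `lightning.py` is a thin delegation to the attached trainer, falling back to an empty list when the module is detached. A minimal standalone sketch of that pattern — `SketchModule` and `SketchTrainer` are hypothetical stand-ins, not the Lightning API:

```python
from typing import Any, List, Optional


class SketchTrainer:
    """Stand-in trainer that simply owns a list of loggers."""

    def __init__(self, loggers: List[Any]) -> None:
        self.loggers = loggers


class SketchModule:
    """Shows how a module-level loggers property delegates to the trainer."""

    def __init__(self) -> None:
        # In real Lightning the trainer reference is attached during fit()
        self.trainer: Optional[SketchTrainer] = None

    @property
    def loggers(self) -> List[Any]:
        # Empty list (not None) when detached, so callers can always iterate
        return self.trainer.loggers if self.trainer else []


m = SketchModule()
assert m.loggers == []  # detached module: safe empty list
m.trainer = SketchTrainer(["tb_logger"])
assert m.loggers == ["tb_logger"]  # delegated to the trainer
```

Returning `[]` rather than `None` for the detached case is what lets the property drop the `Optional` from its return type.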
30 changes: 29 additions & 1 deletion pytorch_lightning/trainer/trainer.py
@@ -563,7 +563,8 @@ def __init__(
        self.__init_profiler(profiler)

        # init logger flags
        self.logger: Optional[LightningLoggerBase]
        self._logger: Optional[LightningLoggerBase]
        self._loggers: List[LightningLoggerBase]
        self.logger_connector.on_trainer_init(logger, flush_logs_every_n_steps, log_every_n_steps, move_metrics_to_cpu)

        # init debugging flags
@@ -2475,6 +2476,33 @@ def _active_loop(self) -> Optional[Union[FitLoop, EvaluationLoop, PredictionLoop
    Logging properties
    """

    @property
    def logger(self) -> Optional[LightningLoggerBase]:
        return self._logger

    @logger.setter
    def logger(self, new_logger: Optional[LightningLoggerBase]) -> None:
        self._logger = new_logger
        if not new_logger:
            self._loggers = []
        elif isinstance(new_logger, LoggerCollection):
            self._loggers = list(new_logger)
        else:
            self._loggers = [new_logger]

    @property
    def loggers(self) -> List[LightningLoggerBase]:
        return self._loggers

    @loggers.setter
    def loggers(self, new_loggers: Optional[Iterable[LightningLoggerBase]]) -> None:
        if new_loggers:
            self._loggers = list(new_loggers)
            self._logger = self._loggers[0] if len(self._loggers) == 1 else LoggerCollection(self._loggers)
        else:
            self._loggers = []
            self._logger = None

    @property
    def callback_metrics(self) -> dict:
        return self.logger_connector.callback_metrics
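The two setters in the diff are deliberately coupled: assigning either property rewrites both backing fields so `trainer.logger` and `trainer.loggers` can never disagree. That invariant can be exercised outside Lightning with a minimal stand-in (`Collection` here is a hypothetical stand-in for `LoggerCollection`):

```python
from typing import Iterable, List, Optional


class Collection(list):
    """Stand-in for LoggerCollection: one logger that wraps several loggers."""


class SketchTrainer:
    """Mirrors the coupling between the logger and loggers setters in the diff."""

    def __init__(self) -> None:
        self._logger: Optional[object] = None
        self._loggers: List[object] = []

    @property
    def logger(self) -> Optional[object]:
        return self._logger

    @logger.setter
    def logger(self, new_logger: Optional[object]) -> None:
        self._logger = new_logger
        if not new_logger:
            self._loggers = []
        elif isinstance(new_logger, Collection):
            self._loggers = list(new_logger)  # unwrap the collection into a list
        else:
            self._loggers = [new_logger]

    @property
    def loggers(self) -> List[object]:
        return self._loggers

    @loggers.setter
    def loggers(self, new_loggers: Optional[Iterable[object]]) -> None:
        if new_loggers:
            self._loggers = list(new_loggers)
            # A single logger stays bare; several are wrapped in a collection
            self._logger = self._loggers[0] if len(self._loggers) == 1 else Collection(self._loggers)
        else:
            self._loggers = []
            self._logger = None


t = SketchTrainer()
t.loggers = ["a", "b"]
assert isinstance(t.logger, Collection) and t.loggers == ["a", "b"]
t.loggers = ["a"]
assert t.logger == "a"
t.logger = None
assert t.loggers == []
```

This is the same special-casing the tests below verify: a one-element list collapses to a bare logger, an empty or `None` assignment clears both properties.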
11 changes: 11 additions & 0 deletions tests/core/test_lightning_module.py
@@ -76,6 +76,17 @@ def test_property_logger(tmpdir):
    assert model.logger == logger


def test_property_loggers(tmpdir):
    """Test that loggers in LightningModule are accessible via the Trainer."""
    model = BoringModel()
    assert model.loggers == []

    logger = TensorBoardLogger(tmpdir)
    trainer = Trainer(logger=logger)
    model.trainer = trainer
    assert model.loggers == [logger]


def test_params_groups_and_state_are_accessible(tmpdir):
    class TestModel(BoringModel):
        def training_step(self, batch, batch_idx, optimizer_idx):
89 changes: 89 additions & 0 deletions tests/trainer/properties/test_loggers.py
@@ -0,0 +1,89 @@
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import LoggerCollection, TensorBoardLogger
from tests.loggers.test_base import CustomLogger


def test_trainer_loggers_property():
    """Test for correct initialization of loggers in Trainer."""
    logger1 = CustomLogger()
    logger2 = CustomLogger()

    # trainer.loggers should be a copy of the input list
    trainer = Trainer(logger=[logger1, logger2])

    assert trainer.loggers == [logger1, logger2]

    # trainer.loggers should create a list of size 1
    trainer = Trainer(logger=logger1)

    assert trainer.loggers == [logger1]

    # trainer.loggers should be an empty list
    trainer = Trainer(logger=False)

    assert trainer.loggers == []

    # trainer.loggers should be a list of size 1 holding the default logger
    trainer = Trainer(logger=True)

    assert trainer.loggers == [trainer.logger]
    assert type(trainer.loggers[0]) == TensorBoardLogger


def test_trainer_loggers_setters():
    """Test the behavior of setters for trainer.logger and trainer.loggers."""
    logger1 = CustomLogger()
    logger2 = CustomLogger()
    logger_collection = LoggerCollection([logger1, logger2])

    trainer = Trainer()
    assert type(trainer.logger) == TensorBoardLogger
    assert trainer.loggers == [trainer.logger]

    # Test setters for trainer.logger
    trainer.logger = logger1
    assert trainer.logger == logger1
    assert trainer.loggers == [logger1]

    trainer.logger = logger_collection
    assert trainer.logger._logger_iterable == logger_collection._logger_iterable
    assert trainer.loggers == [logger1, logger2]

    trainer.logger = None
    assert trainer.logger is None
    assert trainer.loggers == []

    # Test setters for trainer.loggers
    trainer.loggers = [logger1, logger2]
    assert trainer.loggers == [logger1, logger2]
    assert trainer.logger._logger_iterable == logger_collection._logger_iterable

    trainer.loggers = [logger1]
    assert trainer.loggers == [logger1]
    assert trainer.logger == logger1

    trainer.loggers = logger_collection
    assert trainer.loggers == [logger1, logger2]
    assert trainer.logger._logger_iterable == logger_collection._logger_iterable

    trainer.loggers = []
    assert trainer.loggers == []
    assert trainer.logger is None

    trainer.loggers = None
    assert trainer.loggers == []
    assert trainer.logger is None