Releases: speediedan/finetuning-scheduler
Fine-Tuning Scheduler Feature Teaser Release 2.3.0
Note: Because Lightning is not currently planning an official 2.3.0 release, this FTS release is marked as a pre-release and pins a Lightning `2.3.0dev` commit. A return to the normal Lightning release cadence is expected with 2.4.0, and FTS will release accordingly. Installation of this FTS pre-release can either follow the normal installation from source or use the release archive, e.g.:
```bash
export FTS_VERSION=2.3.0 && \
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}-rc1/finetuning_scheduler-${FTS_VERSION}rc1.tar.gz && \
pip install finetuning_scheduler-${FTS_VERSION}rc1.tar.gz
```
[2.3.0] - 2024-05-17
Added
- Support for Lightning and PyTorch `2.3.0`
- Introduced the `frozen_bn_track_running_stats` option to the FTS callback constructor, allowing the user to override the default Lightning behavior that disables `track_running_stats` when freezing BatchNorm layers (see the sketch below). Resolves #13.
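A minimal usage sketch (the trainer setup and commented `fit` call are illustrative placeholders; only the `frozen_bn_track_running_stats` parameter comes from this release):

```python
import lightning.pytorch as pl
from finetuning_scheduler import FinetuningScheduler

# Keep `track_running_stats` enabled for frozen BatchNorm layers rather than
# accepting Lightning's default of disabling it when the layers are frozen.
trainer = pl.Trainer(
    callbacks=[FinetuningScheduler(frozen_bn_track_running_stats=True)]
)
# trainer.fit(MyLightningModule(), datamodule=MyDataModule())  # placeholders
```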
Deprecated
- Removed support for PyTorch `1.13`
Fine-Tuning Scheduler Patch Release 2.2.4
[2.2.4] - 2024-05-04
Added
- Support for Lightning `2.2.4` and PyTorch `2.2.2`
Fine-Tuning Scheduler Patch Release 2.2.1
[2.2.1] - 2024-03-04
Added
- Support for Lightning `2.2.1`
Fine-Tuning Scheduler Release 2.2.0
[2.2.0] - 2024-02-08
Added
- Support for Lightning and PyTorch `2.2.0`
- FTS now inspects any base `EarlyStopping` or `ModelCheckpoint` configuration passed in by the user and applies that configuration when instantiating the required FTS callback dependencies (i.e., `FTSEarlyStopping` or `FTSCheckpoint`), as shown in the sketch below. Part of the resolution to #12.
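A minimal sketch of a configuration FTS now inspects (the monitored metric name and callback settings are illustrative):

```python
import lightning.pytorch as pl
from lightning.pytorch.callbacks import EarlyStopping, ModelCheckpoint
from finetuning_scheduler import FinetuningScheduler

# FTS reads the base EarlyStopping/ModelCheckpoint configuration below and
# applies it to the FTSEarlyStopping/FTSCheckpoint callbacks it instantiates.
trainer = pl.Trainer(
    callbacks=[
        FinetuningScheduler(),
        EarlyStopping(monitor="val_loss", patience=3),
        ModelCheckpoint(monitor="val_loss", save_top_k=2),
    ]
)
```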
Changed
- Updated reference to the renamed `FSDPPrecision`
- Increased the `jsonargparse` minimum supported version to `4.26.1`
Fixed
- Explicitly `rank_zero_only`-guarded `ScheduleImplMixin.save_schedule` and `ScheduleImplMixin.gen_ft_schedule`. Some codepaths were incorrectly invoking them from non-`rank_zero_only`-guarded contexts. Resolved #11 (a sketch of the guard pattern follows this list).
- Added a note in the documentation indicating more clearly the behavior of FTS when no monitor metric configuration is provided. Part of the resolution to #12.
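The guard pattern involved looks roughly like this (a sketch only; the function name echoes the fixed method, but the body is illustrative rather than the actual FTS implementation):

```python
from lightning.pytorch.utilities import rank_zero_only

@rank_zero_only
def save_schedule(path: str, schedule: dict) -> None:
    # With the decorator, only the global rank-zero process executes this body;
    # on all other ranks the call becomes a no-op, preventing concurrent writes.
    with open(path, "w") as f:
        f.write(repr(schedule))
```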
Deprecated
- Removed support for PyTorch `1.12`
- Removed legacy FTS examples
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@Davidham3 @jakubMitura14
Fine-Tuning Scheduler Patch Release 2.1.4
[2.1.4] - 2024-02-02
Added
- Support for Lightning `2.1.4`
Changed
- Bumped `sphinx` requirement to `>5.0,<6.0`
Deprecated
- Removed deprecated lr `verbose` init param usage
- Removed deprecated `tensorboard.dev` references
Fine-Tuning Scheduler Release 2.1.3
[2.1.3] - 2023-12-21
Added
- Support for Lightning `2.1.3`
Fine-Tuning Scheduler Release 2.1.2
[2.1.2] - 2023-12-20
Added
- Support for Lightning `2.1.2`
Fixed
- Explicitly `rank_zero_only`-guarded `ScheduleImplMixin.save_schedule` and `ScheduleImplMixin.gen_ft_schedule`. Some codepaths were incorrectly invoking them from non-`rank_zero_only`-guarded contexts. Resolves #11.
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@Davidham3
Fine-Tuning Scheduler Release 2.1.1
[2.1.1] - 2023-11-08
Added
- Support for Lightning `2.1.1`
Note: The latest `finetuning-scheduler` `2.1.1` release on conda-forge switches to a `lightning` dependency (rather than the standalone `pytorch-lightning` package) to align with the default `lightning` framework installation. Installation of FTS via pip within a conda env continues to be the recommended installation approach, e.g.:
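A minimal sketch (the environment name and Python version are illustrative):

```bash
# Create a conda env, then install FTS with pip inside it
conda create -n fts-env python=3.11 -y
conda activate fts-env
pip install finetuning-scheduler
```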
Fine-Tuning Scheduler Release 2.1.0
[2.1.0] - 2023-10-12
Added
- Support for Lightning and PyTorch `2.1.0`
- Support for Python `3.11`
- Support for simplified scheduled FSDP training with PyTorch >= `2.1.0` and `use_orig_params` set to `True` (see the sketch after this list)
- Unified different FSDP `use_orig_params` mode code-paths to support saving/restoring full, consolidated OSD (PyTorch versions >= `2.0.0`)
- Added support for FSDP `activation_checkpointing_policy` and updated FSDP profiling examples accordingly
- Added support for `CustomPolicy` and the new implementation of `ModuleWrapPolicy` with FSDP `2.1.0`
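A minimal sketch of the simplified scheduled FSDP setup (the wrapped block class is a placeholder; `use_orig_params` and `activation_checkpointing_policy` are the options named above):

```python
import lightning.pytorch as pl
import torch.nn as nn
from lightning.pytorch.strategies import FSDPStrategy
from finetuning_scheduler import FinetuningScheduler

class MyTransformerBlock(nn.Module):
    """Placeholder for the module class(es) FSDP should wrap."""

strategy = FSDPStrategy(
    auto_wrap_policy={MyTransformerBlock},
    activation_checkpointing_policy={MyTransformerBlock},
    use_orig_params=True,  # simplified scheduled FSDP path with PyTorch >= 2.1.0
)
trainer = pl.Trainer(strategy=strategy, callbacks=[FinetuningScheduler()])
```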
Changed
- FSDP profiling examples now use a patched version of `FSDPStrategy` to avoid omni-us/jsonargparse#337 with `jsonargparse` < `4.23.1`
Fixed
- Updated `validate_min_wrap_condition` to avoid overly restrictive validation in some `use_orig_params` contexts
- For PyTorch versions < 2.0, when using the FSDP strategy, disabled optimizer state saving/restoration per Lightning-AI/pytorch-lightning#18296
- Improved FSDP strategy adapter `no_decay` attribute handling
Deprecated
- `FSDPStrategyAdapter` now uses the `configure_model` hook rather than the deprecated `configure_sharded_model` hook to apply the relevant model wrapping (a sketch of the hook follows this list). See Lightning-AI/pytorch-lightning#18004 for more context regarding the `configure_sharded_model` deprecation.
- Dropped support for PyTorch `1.11.x`.
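A minimal sketch of the `configure_model` hook (the module definition is illustrative; FTS's actual wrapping logic lives in `FSDPStrategyAdapter`):

```python
import torch.nn as nn
import lightning.pytorch as pl

class MyModule(pl.LightningModule):
    def configure_model(self) -> None:
        # Called by Lightning before the strategy shards the model; building
        # (or wrapping) layers here lets FSDP shard them as they are created.
        if getattr(self, "model", None) is None:
            self.model = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, 2))
```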
Fine-Tuning Scheduler Patch Release 2.0.9
[2.0.9] - 2023-10-02
Added
- Support for Lightning `2.0.8` and `2.0.9`