[FSDP()][Easy] Make fully_shard() only FULL_SHARD (pytorch#88260)
We can have a separate API for each of the other sharding strategies.
Pull Request resolved: pytorch#88260
Approved by: https://github.com/mrshenli
awgu authored and pytorchmergebot committed Nov 3, 2022
1 parent fc743ec commit 35be73d
Showing 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions in torch/distributed/_composable/fully_shard.py

@@ -30,7 +30,6 @@
 def fully_shard(
     module: nn.Module,
     process_group: Optional[dist.ProcessGroup] = None,
-    sharding_strategy: Optional[ShardingStrategy] = None,
     mixed_precision: Optional[MixedPrecision] = None,
     cpu_offload: Optional[CPUOffload] = None,
     auto_wrap_policy: Optional[Callable] = None,
@@ -51,7 +50,7 @@ def fully_shard(
     forward_prefetch_limit = 1
     state = _init_core_state(
         state,
-        sharding_strategy,
+        ShardingStrategy.FULL_SHARD,
         mixed_precision,
         cpu_offload,
         limit_all_gathers,
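For reference, a minimal sketch of how fully_shard() is called after this change. The import path is inferred from the changed file's location; the model and process-group setup are illustrative and not part of the commit:

import torch.distributed as dist
import torch.nn as nn
from torch.distributed._composable import fully_shard

# Assumes the default process group is already initialized in each worker,
# e.g. via dist.init_process_group("nccl").
model = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 8))

# After this commit, fully_shard() accepts no sharding_strategy argument;
# it always shards with ShardingStrategy.FULL_SHARD internally.
fully_shard(model)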
