Implement len in IterableDatasetShard #13780

Merged · 1 commit · Sep 28, 2021
7 changes: 7 additions & 0 deletions src/transformers/trainer_pt_utils.py
@@ -772,6 +772,13 @@ def __iter__(self):
for i in process_slice:
yield current_batch[i]

def __len__(self):
# Will raise an error if the underlying dataset is not sized.
if self.drop_last:
return len(self.dataset) // self.num_processes
else:
return math.ceil(len(self.dataset) / self.num_processes)


# In order to keep `trainer.py` compact and easy to understand, place any secondary PT Trainer
# helper methods here
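For reference, the length formula shown in this diff can be exercised on its own. The sketch below is a standalone illustration, not part of `transformers`: `shard_len` is a hypothetical helper that mirrors the two branches of the new `__len__` (floor division when `drop_last` is set, ceiling division otherwise).

```python
import math

def shard_len(dataset_len: int, num_processes: int, drop_last: bool) -> int:
    """Mirror the per-shard length formula from the diff above (illustrative only)."""
    if drop_last:
        # Incomplete final group of samples is dropped: floor division.
        return dataset_len // num_processes
    # Incomplete final group still yields samples on some processes: round up.
    return math.ceil(dataset_len / num_processes)

print(shard_len(10, 3, drop_last=True))   # 3
print(shard_len(10, 3, drop_last=False))  # 4
```

Note that, as the added comment in the diff says, calling `len()` on the shard will raise if the underlying dataset itself has no `__len__`.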