Bug description
As shown in the screenshot below, 710144 is my total number of samples, but 100 is the number of batches logged so far. Since my batch size is 64, I expect the total to be 710144 / 64 = 11096 batches, so 11096 should appear where 710144 is shown. Can someone explain this? It has me a little confused.
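For reference, here is the arithmetic behind the expected step count as a small sketch (assuming the DataLoader uses `drop_last=False`, so a trailing partial batch would count as one extra step; in this case the division is exact):

```python
import math

total_samples = 710144
batch_size = 64

# Batches per epoch; ceil covers a possible final partial batch
# when the dataset size is not divisible by the batch size.
num_batches = math.ceil(total_samples / batch_size)
print(num_batches)  # 11096
```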
This is how I log during the training step:
self.log('train_loss', loss, on_step=True, rank_zero_only=True)
Thanks in advance!
JJ