How do I get the total number of batch iterations from a PyTorch DataLoader?
The following is a common pattern for a training loop:
for i, batch in enumerate(dataloader):
Is there a method to get the total number of iterations for this for-loop?
In my NLP problem, the total number of iterations differs from int(n_train_samples / batch_size). For example, if I truncate the training data to only 10,000 samples and set the batch size to 1024, I observe 363 iterations in my NLP problem.
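As a point of comparison, here is a minimal sketch of the same setup with a toy dataset (the dummy tensor data is an assumption, not my real NLP data), where len(dataloader) simply matches ceil(n_samples / batch_size):

```python
import math
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in for the setup above (assumed dummy data, not my real NLP data):
# 10,000 samples, batch size 1024
dataset = TensorDataset(torch.zeros(10_000, 1))
dataloader = DataLoader(dataset, batch_size=1024)

# len() on a DataLoader returns the number of batches the for-loop will run
print(len(dataloader))                 # number of batches
print(math.ceil(len(dataset) / 1024))  # same value here (drop_last=False by default)
```

In this toy case both print 10, which is why the 363 iterations in my real problem confuse me.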
Thank you.