When calculating the total steps, shouldn't we multiply the number of batches per epoch by `ab_size` rather than divide by it? In that case it would be `self.total_steps = (len(train_loader.dataset) // tb_size) * ab_size` instead of `self.total_steps = (len(train_loader.dataset) // tb_size) // ab_size`.
Please correct me if I am wrong anywhere.
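To make the difference concrete, here is a minimal sketch comparing the two formulas. The values of `dataset_len`, `tb_size`, and `ab_size` below are made-up placeholders (in the tutorial, `tb_size` is the effective train batch size and `ab_size` combines gradient accumulation with `max_epochs`):

```python
# Hypothetical values, chosen only for illustration -- not taken from the notebook.
dataset_len = 10_000  # len(train_loader.dataset)
tb_size = 32          # effective train batch size (batch_size * number of devices)
ab_size = 2 * 3       # accumulate_grad_batches * max_epochs

batches_per_epoch = dataset_len // tb_size  # 312

# Formula currently in the tutorial (divides by ab_size):
total_steps_div = batches_per_epoch // ab_size  # 312 // 6 = 52

# Formula proposed above (multiplies by ab_size):
total_steps_mul = batches_per_epoch * ab_size   # 312 * 6 = 1872

print(total_steps_div, total_steps_mul)
```

With these numbers the two versions differ by a factor of `ab_size ** 2`, which directly affects any learning-rate scheduler configured with `total_steps`.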

https://pytorchlightning.github.io/lightning-tutorials/notebooks/lightning_examples/text-transformers.html
cc @Borda @rohitgr7