
I'm using PyTorch Lightning and I call seed_everything(), but I don't want to see the INFO logging message

Global seed set to 1234

on every iteration of my main algorithm.

I've tried `logging.getLogger('pytorch_lightning').setLevel(logging.ERROR)` in the constructor of the PL object, but it doesn't work. I also tried what is suggested in this answer, but that doesn't work either.

Thanks

jas0n

2 Answers


This worked for me:

import logging

# Silence the Lightning logger and stop its records from
# propagating up to the root logger's handlers
log = logging.getLogger("pytorch_lightning")
log.propagate = False
log.setLevel(logging.ERROR)
joshwa
  • After over an hour of trying other suggestions in other StackOverflow questions, this is the only approach that worked for me. To get it to work for a unit test, I had to put all but the import statement into the body of the unit test itself. – Jerry K. Feb 28 '23 at 17:40
  • This no longer worked for me. The reason however is that in newer versions quite a lot of the code moved to the `lightning_fabric` library (which is a dependency of `pytorch_lightning`). So use `"lightning_fabric"` as the logger name above to fix. – Richard Vock Jun 07 '23 at 13:47
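As the comment above notes, newer Lightning versions emit the seed message from `lightning_fabric`, a dependency of `pytorch_lightning`. A minimal sketch that covers both old and new versions, assuming silencing both logger names is acceptable:

```python
import logging

# Silence both logger names so the fix works regardless of whether the
# message comes from pytorch_lightning or lightning_fabric
for name in ("pytorch_lightning", "lightning_fabric"):
    logger = logging.getLogger(name)
    logger.propagate = False
    logger.setLevel(logging.ERROR)
```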

I also went through many suggestions and eventually found a better way. Although the above answer works, it suppresses all other messages from that logger as well.

Instead, call seed_everything() before initializing the logger, like this:

import pytorch_lightning as pl
from pytorch_lightning import seed_everything

# Seed first, before any logger is set up
seed_everything(1234)

logger = pl.loggers.TensorBoardLogger(save_dir="logs", name="example")
Shaido