
Just like the title says. I use -s LOG_FILE=mylog.txt to save the log to an external file, but I would also like to see the log in the console while the spider is running. Is there a way to do that? I'm using Windows 10 and would prefer an answer that works there.

Amateur developer without a computing background here, so please go easy on me.


1 Answer


Use GNU's tee tool:

scrapy crawl myspider 2>&1 | tee crawl.log

2>&1 redirects stderr to stdout - most likely you want errors and info in the same file.
| tee crawl.log pipes that combined output to tee, which splits it between the crawl.log file and stdout.

There's a tee implementation for Windows too:

There's a Win32 port of the Unix tee command, that does exactly that. See http://unxutils.sourceforge.net/ or http://getgnuwin32.sourceforge.net/

taken from: https://stackoverflow.com/a/796492/3737009
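
If you'd rather not install anything extra, PowerShell (built into Windows 10) has a similar built-in cmdlet, Tee-Object. A minimal sketch, assuming your spider is named myspider (note that in Windows PowerShell 5.1, stderr lines from native commands may get wrapped as error records, so the file's formatting can differ slightly from the console):

scrapy crawl myspider 2>&1 | Tee-Object -FilePath crawl.log

This writes the combined output to crawl.log while still echoing it to the console, the same way tee does.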
