I'm trying to override some settings for a crawler run from a script, but the settings don't seem to take effect:
from scrapy import log
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
from someproject.spiders import SomeSpider
spider = SomeSpider()
overrides = {
    'LOG_ENABLED': True,
    'LOG_STDOUT': True,
}
settings = get_project_settings()
settings.overrides.update(overrides)
log.start()
crawler = CrawlerProcess(settings)
crawler.install()
crawler.configure()
crawler.crawl(spider)
crawler.start()
And in the spider:
from scrapy.spider import BaseSpider

class SomeSpider(BaseSpider):
    def __init__(self):
        self.start_urls = ['http://somedomain.com']

    def parse(self, response):
        print 'some test'  # won't print anything
        exit(0)            # will normally exit, failing the crawler
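One thing I suspected while debugging the missing print: if LOG_STDOUT swaps out sys.stdout for a logger (my guess from the setting's name, not something I've confirmed in the Scrapy source), then anything printed would be captured rather than shown on the console. A quick Python 3 sketch of that kind of redirection:

```python
import io
import sys

# Mimic a LOG_STDOUT-style redirect: replace sys.stdout with a buffer,
# so print output is captured instead of reaching the console.
captured = io.StringIO()
real_stdout = sys.stdout
sys.stdout = captured
try:
    print('some test')  # goes into `captured`, not the terminal
finally:
    sys.stdout = real_stdout  # always restore the real stdout

assert captured.getvalue() == 'some test\n'
```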
By setting LOG_ENABLED and LOG_STDOUT, I expect to see the "some test" string printed in the log. I also can't seem to redirect the log to a LOG_FILE, among some other settings I've tried.
I must be doing something wrong... Please help. =D