I am trying to run my Scrapy spider with multiprocessing. I know that CrawlerProcess runs the spider in a single process; what I want is to run the same spider several times in parallel, each time with different arguments. I tried the code below, but it doesn't work. How can I do this with multiprocessing?
import multiprocessing

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

process = CrawlerProcess(settings=get_project_settings())
process.crawl(Spider, data=all_batches[0])

process1 = CrawlerProcess(settings=get_project_settings())
process1.crawl(Spider, data=all_batches[1])

# Pass the method itself, not its result: writing `target=process.start()`
# calls start() immediately in the parent instead of in the child process.
p1 = multiprocessing.Process(target=process.start)
p2 = multiprocessing.Process(target=process1.start)
p1.start()
p2.start()