I am running Docker + Python + a Scrapy spider, driven by Celery.
My spider runs exactly as many times as my Celery concurrency limit and then no more. Can someone help me understand what is going on?
My docker-compose.yml:
celery:
  build:
    context: .
    dockerfile: ./celery-queue/Dockerfile
  entrypoint: celery
  command: -A tasksSpider worker --loglevel=info --concurrency=5 -n myuser@%n
  env_file:
    - .env
  depends_on:
    - redis
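
For context, the worker loads tasksSpider, and the Celery app and task in it are wired up essentially like the sketch below (simplified and partly assumed: the task name and the broker URL env var are placeholders, and spider_results_group is the function shown further down):

import os

from celery import Celery

# Hypothetical sketch of tasksSpider.py; the real broker URL comes from .env.
app = Celery("tasksSpider",
             broker=os.environ.get("REDIS_URL", "redis://redis:6379/0"))

@app.task
def crawl_group():
    # Each call runs the spider once and returns the scraped items.
    # spider_results_group is the function shown below.
    return spider_results_group()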
My spider code:
from pydispatch import dispatcher
from scrapy import signals
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

def spider_results_group():
    results = []

    # Collect every scraped item via the item_passed signal.
    def crawler_results(signal, sender, item, response, spider):
        results.append(item)

    dispatcher.connect(crawler_results, signal=signals.item_passed)

    process = CrawlerProcess(get_project_settings())
    process.crawl(groupSpider)  # groupSpider is my spider class
    process.start()  # the script will block here until the crawling is finished
    process.stop()
    return results
With this code I can run the spider multiple times, but only 5 times in total. When I checked, I concluded this is because my concurrency is only 5, and when the spider runs again (the 6th time) the task just hangs.
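
My current guess: CrawlerProcess starts the Twisted reactor, which cannot be restarted inside the same process, so each of the 5 prefork children can run exactly one crawl, and the 6th task lands on a child whose reactor has already stopped. If that is right, one workaround would be to run each crawl in a throwaway child process so every task gets a fresh reactor. A minimal, untested sketch (relies on the fork start method, the Linux default in Docker; items must be picklable, and inside a Celery prefork worker billiard.Process may be needed instead of multiprocessing, since the pool's children are daemonized):

import multiprocessing

def spider_results_group_fresh():
    # Run the crawl in a separate short-lived process so the Twisted
    # reactor is brand new every time instead of being restarted.
    queue = multiprocessing.Queue()

    def _worker(q):
        q.put(spider_results_group())  # the function above, unchanged

    proc = multiprocessing.Process(target=_worker, args=(queue,))
    proc.start()
    results = queue.get()  # read before join() to avoid a full-pipe deadlock
    proc.join()
    return results

Is this the right diagnosis, or is something else limiting me to 5 runs?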
If you need any other code, please ask.