I have multiple Scrapy spiders that I need to run at the same time, every 5 minutes. The issue is that each one takes 30 seconds to a minute just to start. It seems that each spider boots its own Twisted reactor, and that is what takes most of the time.
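For reference, this is roughly how each spider is launched today, one process per spider (the project and spider names are placeholders for my real ones):

```python
# run_one_spider.py -- what each 5-minute job does today.
# myproject / ExampleSpider are placeholders for my real project.
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

from myproject.spiders import ExampleSpider

process = CrawlerProcess(get_project_settings())  # boots a fresh Twisted reactor
process.crawl(ExampleSpider)
process.start()  # blocks until the crawl finishes, then the reactor shuts down
```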
I've looked into different ways to run multiple spiders at the same time (see Running Multiple spiders in scrapy for 1 website in parallel?), but I need a separate log file and a separate process per spider to integrate well with Airflow.
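For example, the shared-reactor approach from that question looks roughly like the sketch below (the spider imports are placeholders), but it puts every spider in one process with one log, which doesn't fit my Airflow setup:

```python
# Sketch of the shared-reactor approach from the linked question.
from twisted.internet import reactor
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging

from myproject.spiders import SpiderA, SpiderB  # placeholders

configure_logging()
runner = CrawlerRunner()

runner.crawl(SpiderA)  # both crawls share the same Twisted reactor,
runner.crawl(SpiderB)  # so the startup cost is paid only once

d = runner.join()                    # fires when all crawls have finished
d.addBoth(lambda _: reactor.stop())  # then stop the reactor
reactor.run()                        # blocks until reactor.stop() is called
```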
I've also looked into scrapyd, but it doesn't seem to share a single Twisted reactor across multiple spiders. Is that correct?
Are there other ways I could achieve my goals?