I have multiple Scrapy spiders, and each spider lives in its own Scrapy project, like this:
Scrapy project 1 -> spider 1
Scrapy project 2 -> spider 2
When I deploy one project to scrapyd, it works fine and reports one spider. But when I deploy the second spider to the same scrapyd project, the first spider is overwritten.
To deploy, I use this command:
scrapyd-deploy
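I run it from inside each project's directory, so it picks up that project's scrapy.cfg. As far as I understand, the equivalent explicit form (where default is scrapyd-deploy's name for an unnamed [deploy] section) would be:
scrapyd-deploy default -p test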
This is the scrapy.cfg file of the first project:
[settings]
default = bevboc_De.settings
[deploy]
url = http://localhost:6800/
project = test
And this is what listing the spiders shows:
scrapyd-client spiders -p test
test:
www.bevbox.de
So far, so good. Now I want to deploy the next spider to the same project. This is the scrapy.cfg file of the second project:
[settings]
default = frankbrauer360.settings
[deploy]
url = http://localhost:6800/
project = test
After deploying the second spider to the test project, the first spider has been overwritten:
scrapyd-client spiders -p test
test:
www.frankbauer360.de
Am I doing something wrong when deploying, or does this not work because my spiders each live in a separate Scrapy project?
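For what it's worth, I assume I could avoid the overwriting by giving each project its own scrapyd project name instead of test, e.g. in the second project's scrapy.cfg:
[deploy]
url = http://localhost:6800/
project = frankbauer360
But I'd prefer to keep everything under one scrapyd project, if that's possible.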
I want to schedule them with scrapyd so that the spiders run automatically every 24 hours. Is it possible to deploy all my projects into one scrapyd project that contains all my spiders? If not, is it possible to schedule multiple projects automatically with scrapyd?
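To illustrate what I mean by scheduling: my plan is to have cron call scrapyd's schedule.json endpoint once a day for each spider, roughly like this (project and spider names taken from the examples above):
curl http://localhost:6800/schedule.json -d project=test -d spider=www.bevbox.de
As a crontab entry, running daily at 03:00 (the time is arbitrary):
0 3 * * * curl http://localhost:6800/schedule.json -d project=test -d spider=www.bevbox.de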
I'm new to Python, Scrapy, and scrapyd, so I apologize if this is a basic question.