
I have multiple Scrapy spiders, and every spider has its own Scrapy project, like this:

Scrapy project 1 -> spider 1

Scrapy project 2 -> spider 2

When I deploy one project to scrapyd, it works fine and reports one spider. But when I deploy the second spider to the same scrapyd project, the first spider is overwritten.

To deploy, I used this command:

scrapyd-deploy

This is the scrapy.cfg file:

[settings]
default = bevboc_De.settings

[deploy]
url = http://localhost:6800/
project = test

And this is the output:

scrapyd-client spiders -p test

test:
  www.bevbox.de

So far, so good. Now I want to deploy the next spider to the same project. This is the second spider's scrapy.cfg file:

[settings]
default = frankbrauer360.settings

[deploy]
url = http://localhost:6800/
project = test

After deploying the new spider to the test project, the old spider is overwritten:

scrapyd-client spiders -p test

test:
  www.frankbauer360.de

Am I doing something wrong when deploying, or does this not work because my spiders all live in separate Scrapy projects?

I want to schedule them with scrapyd so the spiders run automatically every 24 hours. Is it possible to deploy all my projects into one scrapyd project that contains all my spiders? If not, is it possible to schedule multiple projects automatically with scrapyd?
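
For context, scrapyd starts spiders per (project, spider) pair through its JSON API, so several deployed projects can all be triggered from one small script. A minimal sketch in Python, assuming each spider was deployed under its own project name (the names bevbox and frankbrauer below are illustrative, not from the deployments above):

import requests

SCRAPYD_URL = "http://localhost:6800"

# Illustrative (project, spider) pairs; one schedule.json call
# starts one spider of one deployed project.
JOBS = [
    ("bevbox", "www.bevbox.de"),
    ("frankbrauer", "www.frankbauer360.de"),
]

for project, spider in JOBS:
    # schedule.json is part of scrapyd's JSON API
    response = requests.post(
        f"{SCRAPYD_URL}/schedule.json",
        data={"project": project, "spider": spider},
    )
    print(project, spider, response.json())

Note that scrapyd itself has no built-in recurring scheduler, so a script like this would be run every 24 hours by an external scheduler such as cron or the Windows Task Scheduler.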

I'm new to Python, Scrapy and scrapyd. If this is a stupid question, I apologize.

  • It makes sense that it's overwritten, as you use the project name 'test' for both. Try naming them differently and see if it works. – Wim Hermans Jun 05 '20 at 05:24
  • I have now created a separate scrapyd project for every Scrapy project and built a batch file to start them all at once. With the batch file it is also possible to run the spiders automatically every 24 hours. – CIC3RO Jun 06 '20 at 09:33
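
As the last comment describes, each Scrapy project ended up in its own scrapyd project. A minimal sketch of the two scrapy.cfg files with distinct project names, so neither deployment overwrites the other (the names bevbox and frankbrauer are made up for illustration):

# scrapy.cfg of the first project
[settings]
default = bevboc_De.settings

[deploy]
url = http://localhost:6800/
project = bevbox

# scrapy.cfg of the second project
[settings]
default = frankbrauer360.settings

[deploy]
url = http://localhost:6800/
project = frankbrauer

With both deployed, scrapyd-client spiders -p bevbox and scrapyd-client spiders -p frankbrauer each list their own spider, and both (project, spider) pairs can be scheduled as sketched above.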

0 Answers