Questions tagged [scrapyd-deploy]

13 questions
3 votes • 0 answers

Call to deprecated function retry_on_eintr. retry_on_eintr(check_call, [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', d]

I have to deploy my Scrapy project to Scrapyd on Windows Server 2016. I am using the command scrapyd-deploy local to deploy my project, but it generates the following error: Call to deprecated function retry_on_eintr. …
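
The deprecation warning above typically comes from an older scrapyd-client calling a helper that newer Scrapy releases have deprecated, so the first thing to check is which versions are installed side by side. A minimal sketch (Python 3.8+; distribution names as published on PyPI, adjust if the environment differs):

    # Print the installed versions of the packages involved in the deploy;
    # a too-old scrapyd-client next to a recent Scrapy is the usual cause.
    from importlib.metadata import PackageNotFoundError, version

    for dist in ("Scrapy", "scrapyd", "scrapyd-client"):
        try:
            print(dist, version(dist))
        except PackageNotFoundError:
            print(dist, "not installed")

If scrapyd-client lags behind Scrapy, upgrading it (or pinning Scrapy to a release that still ships retry_on_eintr) is typically the fix; this is an assumption based on the usual cause of that message, not a confirmed diagnosis of this exact setup.
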
3 votes • 1 answer

Scrapyd-Deploy: SPIDER_MODULES not found

I am trying to deploy a scrapy 2.1.0 project with scrapyd-deploy 1.2 and get this error: scrapyd-deploy example /Library/Frameworks/Python.framework/Versions/3.8/bin/scrapyd-deploy:23: ScrapyDeprecationWarning: Module `scrapy.utils.http` is…
merlin • 2,717 • 3 • 29 • 59
1 vote • 0 answers

Why does Scrapyd time out due to refused connection?

I am operating several cloud instances where scrapyd is scheduling scrapy crawlers that write to a remote db server (MySQL 8.x on Ubuntu 20.04). This worked for months. Suddenly it was not possible to deploy with scrapyd-deploy to one of the…
merlin • 2,717 • 3 • 29 • 59
1 vote • 1 answer

scrapyd-deploy with "deploy failed (400)"

I am trying to deploy with scrapyd-deploy to a remote scrapyd server, which fails without an error message: % scrapyd-deploy …
merlin • 2,717 • 3 • 29 • 59
1 vote • 1 answer

Scrapyd-Deploy: Errors due to using os path to set directory

I am trying to deploy a scrapy project via scrapyd-deploy to a remote scrapyd server. The project itself is functional and works perfectly on my local machine, and on the remote server when I deploy it there via git push prod. With…
merlin • 2,717 • 3 • 29 • 59
0 votes • 0 answers

Scrapyd deploy failing python 3.8

Stats: I start Scrapyd in env: (env) sh-3.2$ scrapyd 2023-01-18T14:44:21+0400 [-] Loading /Users/parikshit.mukherjee/PycharmProjects/nn/ufc-data-crawler/env/lib/python3.8/site-packages/scrapyd/txapp.py... 2023-01-18T14:44:21+0400 [-] Basic…
0 votes • 0 answers

Cannot deploy spiders with scrapyd deploy (shows 0 spiders)

While attempting to deploy spiders to scrapyd running locally, I am getting the following response: {"node_name": "a-38u3442zr18hl", "status": "ok", "project": "project_name", "version": "1662563564", "spiders": 0} I have 6 spiders which are…
Tim_B • 129 • 1 • 1 • 10
0 votes • 0 answers

Scrapyd Spiders are going missing every 24 hours

I have a Scrapyd server on Heroku. It works fine and the spider works and connects to the databases without any issue. I have set it to run every day via the scheduler in the ScrapydWeb UI. However, every day the spider seems to disappear and I would have to…
0 votes • 2 answers

Scrapyd spiders are finished but they are still shown as Running in the web UI as well as in listjobs.json

I have deployed Scrapyd as a Docker container on Google Cloud Run. When I run the container locally, everything works fine. But when I deploy the same container on Google Cloud Run, spider jobs are not removed from the Running queue. Though…
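
Comparing listjobs.json between the local container and the Cloud Run deployment shows whether the jobs really hang or only the bookkeeping differs; the service URL and project name below are placeholders:

    # Inspect scrapyd's pending/running/finished queues for one project.
    import requests

    r = requests.get(
        "https://my-scrapyd-service.example.com/listjobs.json",
        params={"project": "myproject"},
    )
    jobs = r.json()
    for state in ("pending", "running", "finished"):
        print(state, [job.get("id") for job in jobs.get(state, [])])

One plausible (unconfirmed) explanation for the difference is that Cloud Run can run several ephemeral instances, each with its own scrapyd state, so the instance answering listjobs.json is not necessarily the one that ran the job.
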
0 votes • 1 answer

How to send scrapy command line argument to scrapyd-client

I want to schedule a spider from scrapyd-client while passing command line arguments as well, e.g.: scrapy crawl spider_name -a person="John" -a location="porto" -o local.csv. The above command works well when running the spider directly with Scrapy, but it…
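
scrapyd's schedule.json endpoint passes any extra POST parameter to the spider as a -a argument, and Scrapy settings can be sent as setting=NAME=VALUE, so the command line above translates roughly as follows (host and project name are placeholders, and emulating -o via a feed setting is an assumption that depends on the Scrapy version):

    # Rough equivalent of:
    #   scrapy crawl spider_name -a person="John" -a location="porto" -o local.csv
    import requests

    r = requests.post(
        "http://localhost:6800/schedule.json",
        data={
            "project": "myproject",           # placeholder project name
            "spider": "spider_name",
            "person": "John",                 # extra keys arrive as -a arguments
            "location": "porto",
            "setting": "FEED_URI=local.csv",  # one way to emulate -o (assumption)
        },
    )
    print(r.json())  # {"status": "ok", "jobid": "..."}
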
0 votes • 1 answer

Run scrapyd in Python 3.6

I've been looking around and I can't seem to find an answer on how to run scrapyd on Python 3 and above. When I run it, it keeps defaulting to Python 2.7, though I recall reading in the docs or elsewhere that scrapyd supports…
Thorvald • 546 • 6 • 18
0 votes • 0 answers

Multiple scrapy projects to one scrapyd project

I have multiple scrapy spiders. Every spider has its own scrapy project, like this: Scrapy project 1 -> spider 1, Scrapy project 2 -> spider 2. When I deploy one project to scrapyd it works fine and says there is one spider. But when I try to…
CIC3RO • 13 • 4
0 votes • 1 answer

scrapyd: How to include files into a deployed package

I am able to run a crawler locally which reads some input from a local file inside the scrapy project. Deployment with scrapyd-deploy fails, as the local file is somehow not in the package. Inside the scrapy project, open a file: with…
merlin • 2,717 • 3 • 29 • 59