
Scrapy Crawlera was working fine on my Windows machine, but I get error 111 when I run it on my Linux server. Why is that?

When I use curl, I get this error: curl: (7) Failed connect to proxy.crawlera.com:8010; Connection refused
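Error 111 is `ECONNREFUSED` on Linux: the host resolved and was reachable, but nothing accepted the connection on that port (typically a closed or firewalled port rather than a DNS or routing problem). A minimal sketch for distinguishing this case, using a hypothetical `check_proxy_port` helper:

```python
import socket

def check_proxy_port(host, port, timeout=5):
    """Try a TCP connection; return None on success or the OS error code.

    errno 111 (ECONNREFUSED on Linux) means the host is reachable but
    nothing is accepting connections on that port -- usually a closed
    or firewalled port, not a DNS or routing failure.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(timeout)
    try:
        sock.connect((host, port))
        return None
    except OSError as exc:
        return exc.errno
    finally:
        sock.close()

# e.g. check_proxy_port("proxy.crawlera.com", 8010)
```

If this returns 111 from the server but succeeds from your Windows machine, the problem is on the server's network path, not in Scrapy.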

Aminah Nuraini

2 Answers


It turned out that cPanel (or maybe Linux itself?) blocks ports by default if they are not whitelisted in the firewall. I opened the port via WHM, since I use cPanel, and everything works fine now.
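For reference, the same whitelisting can be done from the command line. This is a sketch under the assumption that the server runs CSF (ConfigServer Firewall, common on cPanel/WHM boxes); the plain iptables rule is an alternative for servers without CSF:

```shell
# CSF: append 8010 to the allowed outbound TCP ports and reload the firewall.
sed -i 's/^TCP_OUT = "\(.*\)"/TCP_OUT = "\1,8010"/' /etc/csf/csf.conf
csf -r

# Plain iptables equivalent: accept outbound connections to port 8010.
iptables -A OUTPUT -p tcp --dport 8010 -j ACCEPT
```

Both commands require root; the WHM firewall UI does the same thing through the browser.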

Aminah Nuraini

This most likely has nothing to do with Windows or Linux, but with the user agent Scrapy sends. Try something like this in settings.py:

USER_AGENT = 'Mozilla/5.0 (X11; Linux x86_64; rv:7.0.1) Gecko/20100101 Firefox/7.7'
j4hangir
  • Then why does it work on Windows? And I already use Scrapy Fake User Agent, which sets random user agents automatically. – Aminah Nuraini Oct 18 '18 at 03:14
  • @AminahNuraini `proxy.crawlera.com` is a proxy server, so the `user agent` should definitely not be the problem. I guess it depends on `crawlera` itself. Maybe they limit your connections, or use some other mechanism. Try waiting a while or ask them for help. – Sraw Oct 18 '18 at 03:50