I have a Flask app deployed with nginx and gunicorn.

When I send a POST request to my Flask app, the handler is executed twice.

    import json
    import requests

    @app.route("/search", methods=['POST'])
    def search():
        # Schedule a scrapyd crawl and relay its JSON response
        r = requests.post('http://localhost:6800/schedule.json', data='{"project":"rental", "spider":"airbnb"}')
        return json.dumps(r.json())

Here, two different spider jobs are created for a single request.
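
As an aside, scrapyd's schedule.json endpoint documents form-encoded project and spider parameters rather than a raw JSON body; a minimal sketch of the equivalent call (same project and spider names assumed):

    import requests

    # Passing a dict lets requests form-encode the parameters, which is
    # the format the scrapyd documentation shows for schedule.json
    r = requests.post('http://localhost:6800/schedule.json',
                      data={'project': 'rental', 'spider': 'airbnb'})
    print(r.json())  # e.g. {"status": "ok", "jobid": "..."}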

Ganesh Pandey

1 Answer

This is because the reloader spawns two processes when you run the app. You can disable it by setting debug=False or use_reloader=False when you run it.
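
For example, a minimal sketch of an entry point with the reloader disabled (this assumes the app is started with app.run() rather than through gunicorn):

    from flask import Flask

    app = Flask(__name__)

    if __name__ == '__main__':
        # use_reloader=False stops Werkzeug from forking the extra
        # watcher process that re-imports and re-runs the module
        app.run(debug=True, use_reloader=False)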

Check out the answers to these questions:

Özgür Eroğlu
  • I am in a production environment, not in dev. – Ganesh Pandey Feb 12 '17 at 20:06
  • So your app is a single process, but creating two spider processes? – Özgür Eroğlu Feb 12 '17 at 20:17
  • Could you please share the controller code receiving the POST? The code above is just posting. – Özgür Eroğlu Feb 12 '17 at 20:28
  • What is the max_proc setting for scrapyd? It seems as if the daemon starts separate processes of the same spider. From the scrapyd documentation: "Scrapyd also runs multiple processes in parallel, allocating them in a fixed number of slots given by the max_proc and max_proc_per_cpu options, starting as many processes as possible to handle the load." (See the config sketch after this list.) – Özgür Eroğlu Feb 12 '17 at 20:50
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/135528/discussion-between-ganesh-pandey-and-ozgur-eroglu). – Ganesh Pandey Feb 12 '17 at 20:59
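
For reference, max_proc and max_proc_per_cpu live in the [scrapyd] section of scrapyd.conf; a minimal sketch (the values here are illustrative, not the asker's actual config):

    [scrapyd]
    # Hard cap on concurrent Scrapy processes; 0 means derive the
    # limit from CPU count * max_proc_per_cpu
    max_proc = 0
    # Per-CPU slot count used when max_proc is 0 (scrapyd's default is 4)
    max_proc_per_cpu = 4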