
Scenario

I am invoking Python scripts from PHP using exec(...). Python does its part, and it is an unstructured environment: there is no Django or any other framework.

The product is client-facing, and most of the tasks are long-running tasks.

So any request from a client must be satisfied immediately. Time is a main concern for us; we cannot delay the process.

So if I get too many requests from clients (cron jobs may also be running at the same time), I end up with too many Python processes, and as far as I know each exec(...) invokes a new interpreter. If I could somehow avoid invoking the interpreter on each call, that would be good for me. Even 1 KB or 1 MB of saved memory is useful.

Problems

  1. Memory usage is high. (If I can save even 1 MB, I want to try a solution. Assume the script itself is well written: loops sleep properly and garbage collection is managed well.)
  2. Sometimes, for a long-running Python script invoked from PHP using exec(...) without sending the process to the background, the script finishes but it still takes time before the next line after that exec(...) executes. (One possible workaround is sketched below.)
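A possible workaround for the second problem that I have been considering: have the script that exec(...) starts hand the real work off to a detached child and exit immediately. A minimal sketch, assuming a hypothetical worker.py that does the actual long-running work (Python 3, POSIX):

import subprocess

# launcher sketch: exits right away so the caller's exec(...) returns
# immediately, while the real work keeps running detached in the background
subprocess.Popen(
    ["python", "worker.py"],        # worker.py is a placeholder name
    stdin=subprocess.DEVNULL,       # no inherited pipes keep the caller waiting
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    start_new_session=True,         # detach from the caller's session
)
# nothing else to do here; this launcher process ends immediately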

Solution

For the solution, I thought I should expose the Python side through API calls. But remember, I cannot use any framework; to use a framework I would need to rewrite the code, and that is not possible in my case. So CGI and FCGI could help me here.

To run the Python scripts from API calls, I need to enable CGI.
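As a sketch of what I mean, a minimal CGI script (the "task" parameter name and the response text are made up, and this assumes the web server is configured to execute .py files as CGI):

#!/usr/bin/env python
# cgi-bin/start_task.py (sketch): one request = one interpreter start
import cgi

print("Content-Type: text/plain")
print("")                            # blank line ends the CGI headers

form = cgi.FieldStorage()
task = form.getfirst("task", "")     # "task" is a made-up query parameter
print("accepted task: %s" % task)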

But what I found is interesting:

Programs using CGI, to communicate with their web server, need to be started by the server for every request. So, every request starts a new Python interpreter – which takes some time to start up – thus making the whole interface only usable for low load situations.

(From the Python documentation's section on the common gateway interface.)
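This per-request interpreter start is exactly what I want to avoid. For comparison, a rough sketch of the kind of single, long-lived process I have in mind, one interpreter serving many requests, using only the standard library (the port and the response text are made up for the example):

from wsgiref.simple_server import make_server

def app(environ, start_response):
    # every request is handled by this same, already-running interpreter;
    # no new Python process is started per request
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"task accepted\n"]

if __name__ == "__main__":
    make_server("localhost", 8000, app).serve_forever()   # 8000 is an arbitrary port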

If I run a script like the following, then all of them will start at once:

subprocess.Popen(["python", "program1.py"])
subprocess.Popen(["python", "program2.py"])
subprocess.Popen(["python", "program3.py"])
subprocess.Popen(["python", "program4.py"])
subprocess.Popen(["python", "program5.py"])
subprocess.Popen(["python", "program6.py"])

Assume all of these programs are long-running tasks.

1) Can I prevent a Python module from creating more interpreters, and if yes, how?

2) What is the maximum number of interpreters a Python module can create (at once, like in the above code)?
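The closest I have come myself is to cap how many interpreters are allowed to run at the same time, roughly like this sketch (the cap of 3 and the half-second polling interval are arbitrary):

import subprocess
import time

MAX_INTERPRETERS = 3
scripts = ["program1.py", "program2.py", "program3.py",
           "program4.py", "program5.py", "program6.py"]

running = []
for script in scripts:
    # wait until a slot frees up before starting another interpreter
    while len(running) >= MAX_INTERPRETERS:
        running = [p for p in running if p.poll() is None]
        time.sleep(0.5)
    running.append(subprocess.Popen(["python", script]))

for p in running:
    p.wait()    # wait for the remaining processes to finish

But this only limits how many interpreters exist at once; it still pays the interpreter start-up cost for every script.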

Ujjaval Moradiya
  • If that is your script, then it'll spin up 6 python interpreters... – mgilson Dec 09 '16 at 05:26
  • The post has been modified a little bit. And if I have 100 scripts, then it will create 100 interpreters. So how many interpreters can a Python module handle? I am facing this scenario on my own server. – Ujjaval Moradiya Dec 09 '16 at 05:36

1 Answer


I'm not sure if I fully understood what you asked. In general, in any context, code is executed sequentially unless explicitly told otherwise. If you put those lines in your shell, the shell first starts an instance of Python for program1.py. You cannot proceed to program2.py until program1.py has finished. So you have only one Python interpreter running at any point in time.

If you want to run each script as a background process, you need to put & at the end of each line, like python program1.py &. In this way you can run several scripts at the same time. If your scripts take long to finish, you will have 6 interpreters running at the same time. I'm pretty sure every script needs exactly one interpreter dedicated to it.
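To be concrete, a small sketch of the difference between a blocking call and a non-blocking Popen (the script names are just placeholders):

import subprocess

# subprocess.call blocks: program2.py will not start
# until program1.py has exited
subprocess.call(["python", "program1.py"])

# subprocess.Popen returns immediately: program2.py runs
# in the background while this script continues
p = subprocess.Popen(["python", "program2.py"])
print("program2.py started with pid", p.pid)
p.wait()    # only block here if you need to wait for it to finish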

  • Yes, I learned from the comments that it will create 6 interpreters. How can I prevent this? – Ujjaval Moradiya Dec 09 '16 at 05:59
  • To prevent your script from starting six separate interpreters, do not include code in it that starts six separate interpreters. – kindall Dec 09 '16 at 06:13
  • I think it's just normal behavior to have multiple instances of Python if you work with CGI. There is nothing wrong with it, and a workaround may not be easy. If you don't like that, I recommend you start using a web framework like Django. – Dec 09 '16 at 06:24
  • `Popen` is [non-blocking](http://stackoverflow.com/questions/21936597/blocking-and-non-blocking-subprocess-calls) so I'm not sure the parent would wait until one subprocess is finished before calling another. – roganjosh Dec 09 '16 at 07:17
  • @roganjosh No, the parent will not wait for any subprocess to complete its execution. It will just make the call and start executing the next lines. That behavior is fine when I have only a very small number of Python scripts running at a time. Yes, Django is good; Django will not start an interpreter on each call, but then the next question arises: how is Django doing this? – Ujjaval Moradiya Dec 10 '16 at 05:31
  • @UjjavalMoradiya Django is a web framework. It waits for an HTTP request, and when it gets one it processes the request and returns a response. When you go deep into the code, the basic thing that enables handling multiple requests is a socket, so it is basically a socket connection: there is a single Python process with a socket server that processes multiple requests. – fury.slay Sep 13 '17 at 12:44
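For illustration, a minimal sketch of that single-process, socket-based idea using only the standard library (the port 9999 and the one-line-per-request protocol are made up for the example):

import socketserver

class TaskHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # every incoming request is served by this same long-running
        # process; no new interpreter is started per request
        task = self.rfile.readline().strip()
        self.wfile.write(b"accepted: " + task + b"\n")

if __name__ == "__main__":
    server = socketserver.ThreadingTCPServer(("localhost", 9999), TaskHandler)
    server.serve_forever()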