
My goal is to create one main Python script that executes multiple independent Python scripts on Windows Server 2012 at the same time. One benefit in my mind is that I can point Task Scheduler at a single main.py script instead of multiple .py scripts. My server has 1 CPU. I have read about multiprocessing, threading and subprocess, which only added to my confusion a bit. I am basically running multiple trading scripts for different stock symbols, all at the same time, after market open at 9:30 EST. The following is my attempt, but I have no idea whether this is right. Any direction/feedback is highly appreciated!

import subprocess

subprocess.Popen(["python", '1.py'])
subprocess.Popen(["python", '2.py'])
subprocess.Popen(["python", '3.py'])
subprocess.Popen(["python", '4.py'])
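If the child processes should be tracked (for example so main.py only exits after every script has finished), a hedged variant of the same idea keeps the Popen handles and waits on them; the script names are the ones from the question:

```python
import subprocess
import sys

# Script names taken from the question; adjust paths as needed.
scripts = ["1.py", "2.py", "3.py", "4.py"]

# Launch all scripts concurrently, keeping a handle to each child process.
# sys.executable points at the same Python interpreter running main.py.
procs = [subprocess.Popen([sys.executable, script]) for script in scripts]

# Wait for every child; a non-zero exit code means that script failed.
exit_codes = [p.wait() for p in procs]
print(exit_codes)
```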
gibbz00
  • Try using queuing, for example SQS. – pushpendra chauhan Nov 28 '17 at 18:52
  • I love Python -- but this may be one of those times when bash would be a better tool for you: https://stackoverflow.com/questions/28549641/run-multiple-python-scripts-concurrently – SteveJ Nov 28 '17 at 18:53
  • Are the python scripts related in some way or completely unrelated? If yes, what's the relation? – stefreak Nov 28 '17 at 18:55
  • Bash? On Windows Server 2012? – mikeb Nov 28 '17 at 18:55
  • @stefreak The scripts are independent. They are only related in the fact that all the scripts connect to my broker's API and gather some data about individual stocks. Each script has a unique stock symbol. – gibbz00 Nov 28 '17 at 19:00
  • @pushpendrachauhan Will queuing run the scripts concurrently? The scripts gather time-sensitive data from my broker. If yes, please post it as an answer. – gibbz00 Nov 28 '17 at 19:02
  • @SteveJ I am on Windows Server 2012. – gibbz00 Nov 28 '17 at 19:02
  • @gibbz00 Sorry - didn't read well enough. – SteveJ Nov 28 '17 at 19:22
  • @SteveJ bash -> batch. Pretty much the exact same thing can be accomplished with `START "" "path-to-python" "path-to-script"` – Aaron Nov 28 '17 at 19:24

3 Answers


I think I'd try to do it like this:

from multiprocessing import Pool

def do_stuff_with_stock_symbol(symbol):
    # Placeholder: call your broker's API for this symbol here.
    return _call_api()

if __name__ == '__main__':
    symbols = ["GOOG", "AAPL", "TSLA"]
    p = Pool(len(symbols))
    results = p.map(do_stuff_with_stock_symbol, symbols)
    print(results)

(Modified example from multiprocessing introduction: https://docs.python.org/3/library/multiprocessing.html#introduction)

Consider using a constant pool size if you deal with a lot of stock symbols, because every python process will use some amount of memory.
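For example, a fixed-size pool (4 workers here is an arbitrary choice, and the API call is a placeholder) keeps memory bounded while still working through every symbol:

```python
from multiprocessing import Pool

def do_stuff_with_stock_symbol(symbol):
    # Placeholder for the real broker API call.
    return symbol.lower()

if __name__ == '__main__':
    symbols = ["GOOG", "AAPL", "TSLA", "MSFT", "AMZN", "NFLX"]
    # Only 4 worker processes run at once; map() feeds each worker the
    # next symbol as soon as it finishes the previous one.
    with Pool(4) as p:
        results = p.map(do_stuff_with_stock_symbol, symbols)
    print(results)
```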

Also, please note that using threads might be a lot better if you are dealing with an I/O bound workload (calling an API, writing and reading from disk). Processes really become necessary with python when dealing with compute bound workloads (because of the global interpreter lock).

An example using threads and the concurrent futures library would be:

import concurrent.futures

TIMEOUT = 60

def do_stuff_with_stock_symbol(symbol, timeout):
    # Placeholder: call your broker's API for this symbol,
    # giving up after `timeout` seconds.
    return _call_api()

if __name__ == '__main__':
    symbols = ["GOOG", "AAPL", "TSLA"]

    with concurrent.futures.ThreadPoolExecutor(max_workers=len(symbols)) as executor:
        results = {executor.submit(do_stuff_with_stock_symbol, symbol, TIMEOUT): symbol for symbol in symbols}

        for future in concurrent.futures.as_completed(results):
            symbol = results[future]
            try:
                data = future.result()
            except Exception as exc:
                print('{} generated an exception: {}'.format(symbol, exc))
            else:
                print('stock symbol: {}, result: {}'.format(symbol, data))

(Modified example from: https://docs.python.org/3/library/concurrent.futures.html#threadpoolexecutor-example)

Note that threads will still use some memory, but less than processes.

You could use asyncio or green threads if you want to reduce memory consumption per stock symbol to a minimum, but at some point you will run into network bandwidth problems because of all the concurrent API calls :)
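To illustrate the asyncio option, here is a minimal sketch (the API call is faked with a sleep; a real version would use an asynchronous HTTP client):

```python
import asyncio

async def do_stuff_with_stock_symbol(symbol):
    # Stand-in for a real asynchronous API call.
    await asyncio.sleep(0.1)
    return symbol.lower()

async def main():
    symbols = ["GOOG", "AAPL", "TSLA"]
    # gather() runs all coroutines concurrently on one thread, so the
    # per-symbol overhead is far smaller than a thread or a process.
    return await asyncio.gather(*(do_stuff_with_stock_symbol(s) for s in symbols))

if __name__ == '__main__':
    print(asyncio.run(main()))
```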

stefreak
  • Wow, this might be a better approach than the solution I imagined. So do I put all the regular code after the return call within the do_stuff_with_stock_symbol(symbol) method? Any idea how to perform the exact thing with threading, as I only have 1 GB of RAM on my server? – gibbz00 Nov 28 '17 at 19:11
  • Instead of `return _call_api()` you can do (probably almost) anything you want. I added a threaded example. – stefreak Nov 28 '17 at 19:20
  • Thank you for the answer! Do you recommend the threaded example, given you have a good idea of what I am trying to do, which is run multiple trading scripts for different symbols all at the same time after market open at 9:30 EST? – gibbz00 Nov 28 '17 at 19:23
  • Not sure if I have enough info to recommend anything, but for I/O bound workloads threads and asyncio make more sense than processes IMO. If this helped you, please mark my answer as accepted :) – stefreak Nov 28 '17 at 19:26
  • In my scripts, the symbol is not the only thing that is different. I use IB as a broker, and 32 different clientIDs can connect to the API simultaneously. I use a different clientID for each .py script along with different symbols. Please let me know if you can also incorporate clientID = [1, 2, 3] in your code. – gibbz00 Nov 28 '17 at 19:39
  • @gibbz00 In other words: if you are not trying to call the API *AND* compute thousands of prime numbers, use threads or asyncio! :) – stefreak Nov 28 '17 at 19:41
  • @gibbz00 If you have a limited number of client ids, assuming you can only make one simultaneous request per client id, one solution would be to put them all into a queue in the beginning. Then you `get()` a client id before making the API call and `put()` it back afterwards. https://docs.python.org/2/library/queue.html – stefreak Nov 28 '17 at 19:45
  • (That is only possible if you are using threading, not with multiple processes.) – stefreak Nov 28 '17 at 19:45
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/160029/discussion-between-stefreak-and-gibbz00). – stefreak Nov 28 '17 at 19:48
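The client-id queue pattern suggested in the comments above could be sketched like this (the broker call is a placeholder; the client ids come from the discussion):

```python
import queue
import threading

# Pool of broker client ids; the comments mention IB allows up to 32.
id_pool = queue.Queue()
for client_id in [1, 2, 3]:
    id_pool.put(client_id)

results = []

def do_stuff_with_stock_symbol(symbol):
    # Borrow a client id; blocks until one is free if all are in use.
    client_id = id_pool.get()
    try:
        # Placeholder for the real API call using symbol and client_id.
        results.append((symbol, client_id))
    finally:
        # Always return the id so another thread can reuse it.
        id_pool.put(client_id)

threads = [threading.Thread(target=do_stuff_with_stock_symbol, args=(s,))
           for s in ["GOOG", "AAPL", "TSLA", "MSFT"]]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)
```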

While what you're asking might not be the best way to handle what you're doing, I've wanted to do similar things in the past, and it took a while to find what I needed. So, to answer your question:

I'm not promising this to be the "best" way to do it, but it worked in my use case.

I created a class I wanted to use to extend threading.

thread.py

"""
Extends threading.Thread giving access to a Thread object which will accept
A thread_id, thread name, and a function at the time of instantiation. The
function will be called when the threads start() method is called.
"""

import threading


class Thread(threading.Thread):
    def __init__(self, thread_id, name, func):
        threading.Thread.__init__(self)
        self.threadID = thread_id
        self.name = name

        # the function that should be run in the thread.
        self.func = func

    def run(self):
        return self.func()

I needed some work done that was part of another package

work_module.py

import...

def func_that_does_work():
    # do some work
    pass

def more_work():
    # do some work
    pass

Then the main script I wanted to run main.py

from thread import Thread
import work_module as wm


mythreads = []
mythreads.append(Thread(1, "a_name", wm.func_that_does_work))
mythreads.append(Thread(2, "another_name", wm.more_work))

for t in mythreads:
    t.start()

The threads die when run() returns. Since this extends Thread from threading, there are several other options available in the docs here: https://docs.python.org/3/library/threading.html
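If main.py should block until all the work is done (for example so Task Scheduler can see when the run finishes), the threads can be joined after starting them; a minimal sketch with placeholder work functions:

```python
import threading

def work(name):
    # Stand-in for a function from work_module.
    print("done:", name)

threads = [threading.Thread(target=work, args=(n,))
           for n in ("a_name", "another_name")]
for t in threads:
    t.start()
# join() blocks until each thread's run() has returned.
for t in threads:
    t.join()
print("all threads finished")
```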

Dave

If all you're looking to do is automate the startup, creating a .bat file is a great and simple alternative to trying to do it with another python script.

The example linked in the comments shows how to do it with bash on Unix-based machines, but batch files can do a very similar thing with the START command:

start_py.bat:

START "" /B "path\to\python.exe" "path\to\script_1.py"
START "" /B "path\to\python.exe" "path\to\script_2.py"
START "" /B "path\to\python.exe" "path\to\script_3.py"

The full syntax for START can be found here.

Aaron