
Say that you have a singleton to play with (meaning the only way to reset it to its original state is to restart the whole script) and you want to perform a specific task on it multiple times and collect the returned objects. Is there any way to do this without disk I/O? I know I can do it with subprocess.check_output() as in How to spawn another process and capture output in python?, plus file I/O or piping stdio, but is there a cleaner solution that is as simple as a same-process call (Edit: I mean, result = foo())?

#the singleton
def foo():
    crash_when_got_run_twice()   # foo() must not run twice in the same process
    result = something_fancy()
    return result

#the runner
def bar(times):
    for i in range(times):
        result = magic()          # somehow run foo() with fresh state each time
        aggregate_result(result)  # accumulate this run's result
    return aggregated_result()    # return everything aggregated so far

What do you think you can do in magic()?

xxbidiao
  • Which operating system? On unix-like systems you can fork without going back to disk, but on Windows a subprocess re-executes Python and the script, and that means I/O (maybe not all the way to disk if it's still cached in RAM). – tdelaney Jun 22 '17 at 21:48
  • I would like to see a platform-independent solution, but I'm mostly running this on unix-like systems. – xxbidiao Jun 23 '17 at 10:37

1 Answer

On unix-like systems you can fork a child process and have it run the singleton. Assuming you've already imported everything the singleton needs and the singleton itself doesn't touch the disk, it could work without any disk I/O. Fair warning: whatever made this thing a singleton in the first place may come back to bite you. The multiprocessing package can do the heavy lifting for you.

On Windows, a new Python interpreter is executed and a significant amount of state may have to be passed between parent and child, which could have harmful effects. But again, it may work...

import multiprocessing as mp

#the singleton
def foo():
    crash_when_got_run_twice()
    result = something_fancy()
    return result

def _run_foo(i):
    # top-level wrapper so the pool can pickle it; the index argument is ignored
    return foo()

#the runner
def bar(times):
    with mp.Pool(min(mp.cpu_count(), times)) as pool:
        return pool.map(_run_foo, range(times))
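
If you specifically want the fork behavior (and want to fail fast on a platform where it isn't available), you can pin the start method through a context. A minimal sketch, assuming Python 3.4+ and reusing the _run_foo wrapper from above:

import multiprocessing as mp

#the runner, pinned to the fork start method
def bar(times):
    # get_context() raises ValueError where fork is unavailable (e.g. on Windows)
    ctx = mp.get_context("fork")
    with ctx.Pool(min(mp.cpu_count(), times)) as pool:
        return pool.map(_run_foo, range(times))
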
tdelaney
  • One more question here: do they share the same set of global variables, or will each be initialized separately, just as when I start a fresh copy? I would like to know whether different processes will interfere with each other in this approach. – xxbidiao Jun 23 '17 at 17:41
  • I got different results from different implementations. Is there any difference between spawning with subprocess and running the script fresh? – xxbidiao Jun 23 '17 at 18:08
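
On the globals question: with the fork start method, each worker starts with a copy of the parent's state as it was at fork time, and nothing a worker changes flows back to the parent or to the other workers. A small sketch, with names made up purely for illustration:

import multiprocessing as mp

counter = 0  # module-level state in the parent

def _bump(i):
    global counter
    counter += 1      # changes only this worker's copy of the module state
    return counter

if __name__ == "__main__":
    ctx = mp.get_context("fork")
    with ctx.Pool(2) as pool:
        # each worker saw counter == 0 at fork time and counts only its own calls;
        # the split of tasks between the two workers varies from run to run
        print(pool.map(_bump, range(4)))
    print(counter)  # still 0 -- the parent's copy was never touched

With the spawn start method (the default on Windows) the child re-imports the module instead, so module-level state is re-initialized rather than copied, which is one reason the same code can behave differently across platforms and implementations.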