I am trying to create an application where a Tkinter GUI is updated by other objects that are continuously taking data. I was having issues with multithreading, so I decided to try the multiprocessing module.

I've found that you cannot run a Tkinter window inside a multiprocessing.Process. Here is a minimal example:

import Tkinter as tk
import multiprocessing

class Subprocess(multiprocessing.Process):
    def __init__(self):
        multiprocessing.Process.__init__(self)
        self.root = tk.Tk()
    #

    def run(self):
        self.root.mainloop()
    #

    def stop(self):
        self.root.destroy()
        self.terminate()


if __name__ == '__main__':
    process = Subprocess()
    process.start()
    print "I got around the global interpreter lock"
    raw_input()
    print "exiting"
    process.stop()

What I expect to happen is for a Tk window to pop up and "I got around the global interpreter lock" to show up in the terminal. I tested this out on Ubuntu Linux and it worked fine, but when I switched over to Windows 7 (where I am developing my application), it failed with this error:

Traceback (most recent call last):
  File "C:\pathtoscript\multiprocessing_test.py", line 21, in <module>
    process.start()
  File "C:\Python27\lib\multiprocessing\process.py", line 130, in start
    self._popen = Popen(self)
  File "C:\Python27\lib\multiprocessing\forking.py", line 277, in __init__
    dump(process_obj, to_child, HIGHEST_PROTOCOL)
  File "C:\Python27\lib\multiprocessing\forking.py", line 199, in dump
    ForkingPickler(file, protocol).dump(obj)
  File "C:\Python27\lib\pickle.py", line 224, in dump
    self.save(obj)
  File "C:\Python27\lib\pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "C:\Python27\lib\pickle.py", line 419, in save_reduce
    save(state)
  File "C:\Python27\lib\pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Python27\lib\pickle.py", line 649, in save_dict
    self._batch_setitems(obj.iteritems())
  File "C:\Python27\lib\pickle.py", line 681, in _batch_setitems
    save(v)
  File "C:\Python27\lib\pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Python27\lib\pickle.py", line 725, in save_inst
    save(stuff)
  File "C:\Python27\lib\pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Python27\lib\pickle.py", line 649, in save_dict
    self._batch_setitems(obj.iteritems())
  File "C:\Python27\lib\pickle.py", line 681, in _batch_setitems
    save(v)
  File "C:\Python27\lib\pickle.py", line 313, in save
    (t.__name__, obj))
PicklingError: Can't pickle 'tkapp' object: <tkapp object at 0x02BD3D08>

Does anyone know a workaround for this? It seems odd to me that this works on Linux but not on Windows.


1 Answer

This has a simple fix: just create the tkapp object in the child process, rather than in the parent:

import Tkinter as tk
import multiprocessing
from Queue import Empty

class Subprocess(multiprocessing.Process):
    def __init__(self):
        multiprocessing.Process.__init__(self)
        self.queue = multiprocessing.Queue()
    #

    def run(self):
        self.root = tk.Tk()  # Create the Tk instance in the child process
        self.root.after(100, self._check_queue) # Check the queue every 100ms
        self.root.mainloop()

    def _check_queue(self):
        try:
            out = self.queue.get_nowait()
            if out == 'stop':
                self.do_stop()
                return
            # Could check for other commands here, too
        except Empty:
            pass
        self.root.after(100, self._check_queue)

    def stop(self):
        self.queue.put('stop')

    def do_stop(self):
        self.root.destroy()


if __name__ == '__main__':
    process = Subprocess()
    process.start()
    print "I got around the global interpreter lock"
    raw_input()
    print "exiting"
    process.stop()

Trying to create the tkapp in the parent and then start it in the child isn't going to be a workable solution. The only tricky part is that you need to use a Queue so the parent can tell the mainloop in the child process to stop.
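
Since the original goal is a GUI that gets updated by objects continuously taking data, the same queue can carry data as well as the 'stop' command. Here is a minimal sketch along those lines; the Label widget, the update_data method, and the "reading #N" messages are illustrative additions, not part of the code above:

import Tkinter as tk
import multiprocessing
import time
from Queue import Empty

class Subprocess(multiprocessing.Process):
    def __init__(self):
        multiprocessing.Process.__init__(self)
        self.queue = multiprocessing.Queue()

    def run(self):
        # Everything Tkinter lives in the child process
        self.root = tk.Tk()
        self.label = tk.Label(self.root, text="waiting for data...")
        self.label.pack()
        self.root.after(100, self._check_queue)
        self.root.mainloop()

    def _check_queue(self):
        try:
            msg = self.queue.get_nowait()
            if msg == 'stop':
                self.root.destroy()
                return
            # Anything else is treated as new data for the label
            self.label.config(text=str(msg))
        except Empty:
            pass
        self.root.after(100, self._check_queue)

    def update_data(self, value):
        # Called from the parent; only the queue is touched here
        self.queue.put(value)

    def stop(self):
        self.queue.put('stop')


if __name__ == '__main__':
    process = Subprocess()
    process.start()
    for i in range(5):
        process.update_data("reading #%d" % i)
        time.sleep(1)
    process.stop()
    process.join()

The pattern is the same as above: every Tkinter object is created and used only in the child, and the parent communicates with it exclusively through the queue.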

Also, for what it's worth, running the original code on Linux actually crashes the interpreter for me:

XIO:  fatal IO error 11 (Resource temporarily unavailable) on X server ":0"
      after 87 requests (87 known processed) with 0 events remaining.
[xcb] Unknown sequence number while processing queue
[xcb] Most likely this is a multi-threaded client and XInitThreads has not been called
[xcb] Aborting, sorry about that.
python: ../../src/xcb_io.c:274: poll_for_event: Assertion `!xcb_xlib_threads_sequence_lost' failed.
  • There is an issue with the code as you have written it: when it closes, i.e. when process.stop() gets called, it throws the error "AttributeError: 'Subprocess' object has no attribute 'root'", which is odd because the self.root attribute was set in the run function – trvrrp Nov 10 '14 at 01:05
  • @trvrrp Fixed now. Sorry, I missed the call to `stop` initially. You can't access `root` from the parent, because the attribute only exists in the child process. You have to use a `multiprocessing.Queue` to tell the child to terminate the mainloop instead. – dano Nov 10 '14 at 01:10
  • Is there a way to pass a target function into that subprocess? If it could also handle arguments and return values, that would be even better. – Honn Feb 05 '21 at 21:56