So, I have the following monstrosity: a python2 script that needs a bunch of modules that are only available for python2, plus some functionality that requires libraries which don't work with the (old) python2 version I'm using. I figured I might as well implement that functionality with the latest python version, i.e. python3. So what I do now (in the python2 script) is call
subprocess.call(["/path/to/python3", "python3_script.py", "argument", "more argument"])
and then in the python3 script, I do
import sys

some_variable = sys.argv[1]
other_variable = sys.argv[2]
It's not pretty, but it works, because until now I've only needed to pass simple strings.
However, now I need to send larger and more complex data structures (essentially dicts of objects). While I could in theory strip all the methods out of the objects, reimplement them as freestanding functions, manually serialize the dicts, and deserialize them on the python3 side, I'd like something more robust and less labor-intensive.
As I see it, I have two options: use a portable serialization format (though that won't let me keep objects with methods), or find some way to share object definitions and data between a python2 and a python3 instance.
So, say I have a module called Foo, and in it I define a class Foo with some methods: can I use that from a python2 and a python3 process running at the same time? More specifically, will the .pyc files each interpreter generates differ and interfere with each other?
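To make that concrete, here is roughly the kind of module I mean (Foo and its contents are just placeholders); both the python2 and the python3 process would import it from the same file:

# Foo.py -- hypothetical shared module, imported by both interpreters
class Foo(object):
    def __init__(self, name, values):
        self.name = name
        self.values = values

    def total(self):
        # a method I'd rather keep than rewrite as a freestanding function
        return sum(self.values)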
Secondly, is there a way (either in the language or a library) to serialize a data structure in python2, pass it as a string to a python3 script, and then deserialize it correctly on the python3 side?
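For what it's worth, the kind of thing I'm imagining is sketched below; I don't know whether this actually survives the 2-to-3 boundary, which is part of what I'm asking. It assumes pickle protocol 2, passes the payload over stdin instead of argv, and would only work if the class definitions are importable under the same module name on both sides.

# python2 side (sketch): pickle the dict of objects and pipe it to the python3 script
import pickle
import subprocess

payload = pickle.dumps(data, protocol=2)  # 'data' would be the dict of objects
proc = subprocess.Popen(["/path/to/python3", "python3_script.py"], stdin=subprocess.PIPE)
proc.communicate(payload)

# python3 side (sketch): read the raw bytes from stdin and unpickle them
import pickle
import sys

data = pickle.loads(sys.stdin.buffer.read(), encoding="latin-1")  # encoding needed for py2 str objects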