4

I have a Python instance running on a Linux server, where I have created a global array using a custom global-variables class. I want to pass an object of this class as a command-line argument to a Python script that I will run on a Windows VM. How do I pass the object as a command-line argument in Python? Or is there a better way to do it?

Flying falcon
  • What objects are we talking about here? A serialized generic object, or Python string/int/dict/... literals as strings? – timgeb Feb 26 '16 at 16:05
  • @cdarke well technically you could encode the command line argument and then unpickle, but yeah... – timgeb Feb 26 '16 at 16:15
  • 1
    *"Or is there any better way to do it?"* Any better way to do what? What is the problem that you are trying to solve? – mzjn Feb 26 '16 at 16:21
  • What is the object? What does it represent? – Tom Dalton Feb 26 '16 at 16:21
  • I have a class globalVars() which I use to initialise some global variables using get() and set() methods, so that I can use these variables in any module in my file structure just by creating an object of that class. Now I want to pass that object to another instance of Python which I will run in the Windows VM's CMD. (Basically I want to pass all those variables to the Windows VM.) P.S. I am new to Python, so I might be missing some concepts here :) – Flying falcon Feb 26 '16 at 19:54
  • What sort of objects are the variables? That matters quite a bit. If they're just builtins like dictionaries, lists, strings, and ints, you have an easier path ahead of you than if you're passing around other objects. – pydsigner Feb 26 '16 at 20:17
  • Thank you all, I used pickle and it worked just fine :) – Flying falcon Mar 01 '16 at 10:57

2 Answers

3

You can use json.dumps() and json.loads() or pickle.dumps() and pickle.loads() for this purpose:

>>> import json
>>> json.dumps(['Hi'])
'["Hi"]'
>>> json.loads(_)
['Hi']

>>> import pickle
>>> pickle.dumps(['Hi'])
b'\x80\x03]q\x00X\x02\x00\x00\x00Hiq\x01a.'
>>> pickle.loads(_)
['Hi']
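Since the question is specifically about the command line, here is a minimal sketch of passing a JSON-serialized object through `argv` (the payload and the inline receiver script below are invented for illustration):

```python
import json
import subprocess
import sys

# Hypothetical data to send to the other Python process.
payload = {"host": "vm01", "retries": 3, "tags": ["a", "b"]}

# Serialize to a single string argument; json.dumps output contains
# no newlines, so it survives argv intact.
arg = json.dumps(payload)

# The receiving script reads sys.argv[1] and deserializes it.
receiver = (
    "import json, sys; "
    "data = json.loads(sys.argv[1]); "
    "print(data['retries'] + 1)"
)

result = subprocess.run(
    [sys.executable, "-c", receiver, arg],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # -> 4
```

Passing the argument as a list element (rather than building a shell string) sidesteps quoting issues entirely.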

Note that if you are trying to pass a custom class you will have to do some extra work: with JSON you'll need functions to convert to and from a JSON-compatible representation, while pickle handles the conversion automatically but still needs access to the class definition on both ends.
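For instance, a custom class can be round-tripped through JSON with a `default` encoder and an `object_hook` decoder (the `GlobalVars` class below is a hypothetical stand-in for the asker's globalVars):

```python
import json

class GlobalVars:
    """Hypothetical stand-in for the asker's globalVars class."""
    def __init__(self, values=None):
        self.values = dict(values or {})
    def get(self, key):
        return self.values[key]
    def set(self, key, value):
        self.values[key] = value

def to_json(obj):
    # Tell json.dumps how to represent the custom class.
    if isinstance(obj, GlobalVars):
        return {"__globalvars__": True, "values": obj.values}
    raise TypeError(f"Cannot serialize {type(obj).__name__}")

def from_json(d):
    # Reverse the conversion when loading.
    if d.get("__globalvars__"):
        return GlobalVars(d["values"])
    return d

g = GlobalVars({"timeout": 30})
text = json.dumps(g, default=to_json)
restored = json.loads(text, object_hook=from_json)
print(restored.get("timeout"))  # -> 30
```

The `"__globalvars__"` marker key is an arbitrary convention; any tag that won't collide with real data will do.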

However, I think you'd be best off running a task execution server in the VM. While the primary focus of these server options is to allow scalability, they're quite good at the remote aspects as well. This abstracts away all of the communication and serialization solutions that, as @J.F. Sebastian said, you really don't need to reinvent.

Celery is probably the most commonly used task execution server library. It takes some work to set up, but is simple to use once configured: mark your function with a Celery decorator to make it a task, start a worker on the VM, import the module, and call the task's delay() method with the same arguments you would pass to the function itself. Once everything is working, the Celery worker can be set up as a Windows service.

# app.py (adapted from examples in the Celery Getting Started tutorial)
from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

@app.task
def my_function(a, b):
    return a * b


# main.py
import app

result = app.my_function.delay(4, 5)
print(result.get())

Sometimes, though, Celery is just too much hassle. If the function uses third-party libraries, they'll have to be installed on the Linux server as well as the VM, since Celery imports the task module on both ends. And I've personally had trouble getting Celery set up in the first place.

A simpler alternative is TaskIt. (Full disclosure: I am the developer of TaskIt.) It uses a more traditional server-client connection style, so all that has to work is a standard TCP socket. By default it uses JSON to serialize objects, but pickle is also supported.

# server.py
from taskit.backend import BackEnd

def my_function(a, b):
    return a * b

backend = BackEnd(dict(my_function=my_function))
backend.main()


# client.py
from taskit.frontend import FrontEnd

backend_addr = '127.0.0.1'
frontend = FrontEnd([backend_addr])
print(frontend.work('my_function', 4, 5))
pydsigner
  • You can't just send the output of pickle or json to a bash command line, because bash applies its own quoting and escaping rules, so you'll almost certainly get invalid command errors. – Cerin Dec 17 '20 at 23:41
  • @Cerin you can indeed pass such data through the command line, though it will need to be escaped using something like `shlex.quote()`. – pydsigner Dec 24 '20 at 03:12
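As the comments note, raw pickle bytes can't go straight onto a shell command line. One sketch of the workaround (not from the original answer): base64-encode the pickle so it becomes plain ASCII, then quote it with `shlex.quote()`:

```python
import base64
import pickle
import shlex
import subprocess
import sys

# Hypothetical state to ship across the command line.
state = {"counter": 7}

# base64 turns the binary pickle into shell-safe ASCII text.
blob = base64.b64encode(pickle.dumps(state)).decode("ascii")

# Inline receiver: decode and unpickle sys.argv[1].
receiver = (
    "import base64, pickle, sys; "
    "state = pickle.loads(base64.b64decode(sys.argv[1])); "
    "print(state['counter'])"
)

# shlex.quote makes each piece safe to embed in a shell string.
cmd = (f"{shlex.quote(sys.executable)} -c {shlex.quote(receiver)} "
       f"{shlex.quote(blob)}")
out = subprocess.run(cmd, shell=True, capture_output=True,
                     text=True, check=True)
print(out.stdout.strip())  # -> 7
```

The usual caveat applies: only unpickle data from sources you trust, since unpickling can execute arbitrary code.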
0

Use any method that you would normally use to communicate between processes running on different computers.

The multiprocessing module from the stdlib supports this use case, Jupyter supports remote kernels, and libraries such as execnet exist for exactly this kind of remote execution.

You can pass an object as a command-line argument if you serialize it to a string first, but there is no need to invent yet another way to execute Python code remotely.

jfs