I have two doit tasks, one of which depends on the other. For example:

def task_deploy():
    return {
        'actions': ['do some deploy commands'],
        'file_dep': ['dist'],
        'params': [{'name': 'projectName',
                    'short': 'p',
                    'long': 'projectName',
                    'default': 'project',
                    'type': str,
                    'help': 'The project name to deploy.'}]
        }

def task_create_distibution_archive():
    return {
        'actions': ['do something that requires projectName'],
        'doc': 'Creates a zip archive of the application in "dist"',
        'targets': ['dist']
    }

Is there a way to share or pass the arguments of a task to another one? I have read pretty much everything I could on task creation and dependencies on pydoit.org, but haven't found anything similar to what I want.

I am aware that I could use yield to create these two tasks at the same time, but I'd like to use a parameter when executing the task, not when I am creating it.

2 Answers

Is there a way to share or pass the arguments of a task to another one?

Yes, using `getargs`: http://pydoit.org/dependencies.html#getargs

In your example, you would need to add another action to the `deploy` task just to save the passed parameter, because `getargs` reads values that another task's action has computed and returned.
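A minimal sketch of this approach, adapted from the question's tasks (the echo action and the `save_params`/`archive` helpers are placeholders, and the `file_dep`/`targets` entries are omitted here to keep the example focused on `getargs`, which itself adds an implicit task dependency):

```python
# dodo.py -- sketch: pass `projectName` from `deploy` to another task via `getargs`

def task_deploy():
    def save_params(projectName):
        # A Python action that returns a dict makes those values
        # available to other tasks through `getargs`.
        return {'projectName': projectName}

    return {
        'actions': [save_params, 'echo doing deploy commands'],
        'params': [{'name': 'projectName',
                    'short': 'p',
                    'long': 'projectName',
                    'default': 'project',
                    'type': str,
                    'help': 'The project name to deploy.'}],
    }

def task_create_distibution_archive():
    def archive(projectName):
        print('archiving %s' % projectName)

    return {
        'actions': [archive],
        # pull `projectName` from the value saved by the `deploy` task
        'getargs': {'projectName': ('deploy', 'projectName')},
    }
```

Running `doit create_distibution_archive -p myproject` would then run `deploy` first (because of the `getargs` dependency) and feed its saved `projectName` into the archive action.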

schettino72
  • Since the time I posted this question, the plugin I was coding evolved quite a bit, but I did use `getargs` as a first solution, and that worked well. I had to change my solution to parse the arguments myself because of a limitation when using `getargs`, as it doesn't expose a method to get **all** arguments, and I had a need to accept arbitrarily named arguments. And not that I want to bash @gocards response, but the goal of passing arguments between tasks was to avoid globals. I'm sure uncle Bob would do the same ;) – Jonathan Jun 08 '16 at 14:34

You could just use a global variable such as `commonCommand` below. If you have more complex needs, create a class to hold the shared parameters.

class ComplexCommonParams(object):
    def __init__(self):
        self.command = 'echo'

# a class instance for more complex needs, or a plain global for simple ones
params = ComplexCommonParams()
commonCommand = 'echo'

def task_x():
    global commonCommand
    return {
        'actions': [commonCommand + ' Hello2 > asdf'],
        'targets': ['asdf']
        }

def task_y():
    global commonCommand
    return {
        'actions': [commonCommand + ' World'],
        'file_dep': ['asdf'],
        'verbosity': 2}
goCards
  • Does this not cause issues if you're running tasks in parallel? As I understand it, each task is then spawned as a separate process using `multiprocessing`, so it won't have access to the parent process's globals. – Harry Apr 21 '17 at 13:47
  • When the parent forks, it gives the child a copy of everything in the parent. If the child modifies the global, the parent cannot see the change. – goCards Apr 28 '17 at 01:20
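The point in the comments can be illustrated directly with the standard-library `multiprocessing` module (this standalone sketch is not doit-specific): the child process gets its own copy of the module globals, so a change made in the child never reaches the parent.

```python
import multiprocessing

# module-level "global" shared state, analogous to commonCommand above
shared = {'command': 'echo'}

def modify():
    # Runs in the child process; only changes the child's copy.
    shared['command'] = 'changed-in-child'

if __name__ == '__main__':
    p = multiprocessing.Process(target=modify)
    p.start()
    p.join()
    # The parent's copy is untouched.
    print(shared['command'])  # prints 'echo'
```

This is why a plain global works for read-only configuration but cannot be used to pass values *between* tasks once doit runs them in parallel processes.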