
I'm authoring a PowerShell module with a bunch of different functions that share a common set of parameters. With all the parameter metadata, these functions are becoming rather tough to manage. Any small change to a parameter, or adding or removing one, means I have to go update a bunch of different function files.

Aside from templatizing code / code-gen, does anyone have a recommended pattern for sharing parameter definition blocks across many functions in PowerShell?

    One thing that I did when I was working with several functions is I started using a hashtable to pass information around between functions. Instead of several parameters I would store everything in the hashtable and only require that and maybe a few other parameters. It made it much easier to move large amounts of data around. – Jason Snell Jul 12 '17 at 15:42
  • @Jason +1 - Also found this approach useful. In my use case it was to override behaviour in a function, so the hashtable could be omitted or carry anywhere from 1 to n arguments. – G42 Jul 12 '17 at 16:02
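The hashtable approach described in the comments is essentially PowerShell's built-in splatting feature. A minimal sketch (the Get-Widget/Set-Widget function names and parameters are invented for illustration):

```powershell
# Collect the shared parameter values once in a hashtable.
$commonParams = @{
    ComputerName = 'server01'
    TimeoutSec   = 30
}

# Splatting (@commonParams instead of $commonParams) expands the
# hashtable into named arguments on any function that declares
# matching parameter names.
Get-Widget @commonParams
Set-Widget @commonParams -Enabled
```

This centralizes the *values* at the call site, though each function still has to declare the parameters itself.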

2 Answers


I feel your pain. I've run into this many, many times. Unfortunately there's no great, native PowerShell solution that solves all of it. A few possible options:

Write Tests

Simply: use Pester, write tests. This doesn't reduce the amount of copy/pasting or re-typing of the same parameter blocks, but it does ensure that if you change a parameter in one place and forget to change it somewhere else, or make an incompatible change, the tests will fail.

Of course that means you have to write tests in a way that will catch those problems, but you should be anyway.

I still don't write tests nearly as often as I should, but I think this option provides the best overall outcome, though it has a high-ish barrier to entry.
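As a sketch of the kind of test that catches parameter drift (Pester v4 assertion syntax; the function and parameter names are placeholders for your own):

```powershell
# Assert that every public function exposes the shared parameter
# with the expected type, so a missed update fails the test run.
$functions = 'Get-Widget', 'Set-Widget', 'Remove-Widget'   # placeholder names

Describe 'Common parameters' {
    foreach ($name in $functions) {
        It "$name has a string -ComputerName parameter" {
            $param = (Get-Command $name).Parameters['ComputerName']
            $param | Should -Not -BeNullOrEmpty
            $param.ParameterType | Should -Be ([string])
        }
    }
}
```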

Use Strongly-Typed Custom Classes for Params

At some point, you might realize that a group of parameters needs to be passed all over the place, and really, those parameters are all describing different aspects (properties) of a certain unit or concept (a class).

You'll recognize that this is the case when you start realizing that the parameters depend on each other (more than you can describe with a parameter set), so you might start writing validation for it and realize that it's getting very complicated.

The solution here is that these parameters should be properties of a class, so that the class takes care of validating the values of and relationships between the properties.

This will become especially apparent and important when you realize that you have more than one of these "groups" in your parameters, and you need multiple objects. Writing parameter sets and validation becomes exponentially more complicated with any more than 1 of these, and breaking them out into classes will alleviate that in a huge way.
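As a rough sketch of that refactoring (the class name, properties, and Get-Widget function are invented for illustration; PowerShell 5+ class syntax):

```powershell
# Related parameters become properties of one class, which validates
# its own values on assignment.
class ConnectionInfo {
    [ValidateNotNullOrEmpty()][string] $ComputerName
    [ValidateRange(1, 65535)][int]     $Port = 5985
    [pscredential]                     $Credential

    ConnectionInfo([string] $computerName) {
        $this.ComputerName = $computerName
    }
}

function Get-Widget {
    Param(
        [Parameter(Mandatory)]
        [ConnectionInfo] $Connection   # one parameter replaces three
    )
    "Connecting to $($Connection.ComputerName):$($Connection.Port)"
}

$conn = [ConnectionInfo]::new('server01')
Get-Widget -Connection $conn
```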

Drawbacks

Writing classes in PowerShell kind of sucks. Additionally, there's no good way to export a class that's defined in the PowerShell module, which means your functions are aware of it, but the consumer of the module is not.

That, in turn, means that you cannot use the class as a parameter of a public function. There are janky ways to export it, so you could do that, but...

Alternatively

Write the classes in C#. You don't need to do anything special with an assembly (.dll) to use it in PowerShell, as long as the class is public. Import-Module just works.

But of course, that's an entirely different development pipeline, build process, distribution, etc.
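For a quick sense of the C# route without a separate build pipeline, Add-Type can compile the class inline (a throwaway sketch; the class and function names are invented, and a real module would ship a compiled .dll loaded via Import-Module):

```powershell
# Compile a public C# class on the fly; because it's public, it is
# visible to the module's consumers, unlike a PowerShell class.
Add-Type -TypeDefinition @"
public class ConnectionInfo
{
    public string ComputerName;
    public int Port;
}
"@

function Get-Widget {
    Param([ConnectionInfo] $Connection)
    "Connecting to $($Connection.ComputerName):$($Connection.Port)"
}

$conn = New-Object ConnectionInfo
$conn.ComputerName = 'server01'
$conn.Port = 5985
Get-Widget -Connection $conn
```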

Templating / Code-gen

You touched on it, but... seriously, don't do it.

briantist
    If you have a large module with many reusable parameters I'd definitely go for writing it in `C#` – Mathias R. Jessen Jul 12 '17 at 16:35
  • @MathiasR.Jessen I would too, but for many it's a large hurdle. Forgetting about the syntax differences and change in mindset between writing (mostly procedural) scripts and OOP, beyond that there's the large difference in workflow as I mentioned (build process, versioning, distribution/packaging). It has many advantages but there is a non-trivial extra workload to manage it properly that you don't have (or do differently) with only scripted processes. – briantist Jul 12 '17 at 17:30

If the functions share that many parameters, I suspect they have a lot of code in common as well. If that is the case, I would consider the following approach which I used for a logging framework that includes functions like Log-Entry, Log-Debug and Log-Verbose:

  • Create a single main function (in my case Log-Entry)
  • Create aliases for each additional function (in my case Log-Debug and Log-Verbose)
  • Divert the functionality by checking the invoked name via $MyInvocation.InvocationName

Like:

Function Log-Entry {
<#
.Synopsis
    Log-Entry
.Description
    Displays and records cmdlet processing details in a file
#>
Param(
    ...
)
    ...
    If ($MyInvocation.InvocationName -eq "Log-Debug") {
        ...
    }
    If ($MyInvocation.InvocationName -eq "Log-Verbose") {
        ...
    }
    ...
}
Set-Alias Log-Debug    Log-Entry -Description "By default, the Log-Debug entry is not displayed and not recorded, but you can display it by changing the common -Debug parameter."
Set-Alias Log-Verbose  Log-Entry -Description "By default, the Log-Verbose entry is not displayed, but you can display it by changing the common -Verbose parameter."
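A stripped-down, runnable version of the same pattern (the Write-* calls stand in for the real logging logic):

```powershell
Function Log-Entry {
    Param([string] $Message)
    # $MyInvocation.InvocationName holds the name the caller actually
    # used, so one function body can branch per alias.
    switch ($MyInvocation.InvocationName) {
        'Log-Debug'   { Write-Debug   $Message }
        'Log-Verbose' { Write-Verbose $Message }
        default       { Write-Host    $Message }
    }
}
Set-Alias Log-Debug   Log-Entry
Set-Alias Log-Verbose Log-Entry

Log-Entry   'always shown'
Log-Verbose 'shown only when the Verbose preference is enabled'
```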
iRon