I want to start a Node.js script with PowerShell. The data to be processed should come from a PowerShell array.
Is this possible?
Using command-line arguments, you cannot pass an array as such to an external program such as node via its CLI, as there is no such construct built in.
There are two workarounds:
(a) Pass the array elements as individual arguments.
(b) Use a custom representation of the array inside a single string argument, which the target program must parse.
A simple demonstration of (a):
# Construct a PowerShell array and pass its elements as individual
# arguments to an ad-hoc Node.js script:
PS> $arr = 'one', 'two'; node -pe 'process.argv.slice(1)' $arr
[ 'one', 'two' ]
Note: If you pass the path of a script file, use 2 rather than 1 as the offset, because element 1 of argv is then the script's path.
Note how simply passing the PowerShell $arr as-is made PowerShell pass its elements as individual arguments.
A simple demonstration of (b):
# Pass the array as a single string containing the space-separated
# elements, and convert that string to an array in JavaScript.
PS> $arr = 'one', 'two'; node -pe 'process.argv[1].split(/ /)' "$arr"
[ 'one', 'two' ]
"$arr"
, i.e. enclosing the array in "..."
, implicitly stringifies it, resulting in a space-separated list of its (stringified) elements (you can change the separator via the $OFS
preference variable).
The JavaScript .split()
method converts the string back to an array.
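For example, if $OFS were set to ',' on the PowerShell side, the JavaScript side would simply need to split on that separator instead. A minimal sketch of just the string-to-array conversion, with the two strings standing in for what PowerShell would pass:

```javascript
// The stringified array as PowerShell passes it with the default $OFS
// (a space), and with a hypothetical $OFS = ','
const spaceSeparated = 'one two';
const commaSeparated = 'one,two';

// .split() with the matching separator recovers the elements:
console.log(spaceSeparated.split(/ /)); // [ 'one', 'two' ]
console.log(commaSeparated.split(',')); // [ 'one', 'two' ]
```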
If you need to support array elements with embedded spaces and possibly even data types other than string, a more sophisticated approach is needed.
As Lee Dailey suggests, you can use JSON for that:
# Construct a PowerShell array and pass it as a single argument
# in JSON form, which JavaScript can easily parse:
PS> $arr = 'one', 2; node -pe 'JSON.parse(process.argv[1])' (ConvertTo-Json $arr).Replace('"', '\"')
[ 'one', 2 ] # Note that the numeric element was preserved as such.
Note the unfortunate need to call .Replace('"', '\"') on ConvertTo-Json's output in order to manually \-escape the " chars. in the JSON text, which shouldn't be necessary, but as of PowerShell Core 7.1 still is, due to PowerShell's broken argument-passing to external programs - see this answer.
You can avoid the escaping headaches if you pass the JSON text via the pipeline, which the Node.js script can read via stdin:
# Construct a PowerShell array, convert it to JSON, and send it
# via the *pipeline* to the Node.js script.
# NOTE: On *Windows PowerShell*, run
# $OutputEncoding = [Text.Utf8Encoding]::new($false)
# first, to ensure that non-ASCII characters are properly encoded.
# PowerShell [Core] v6+ already defaults to BOM-less UTF-8.
$arr = 'one', 2
ConvertTo-Json $arr |
node -pe "JSON.parse(require('fs').readFileSync(0).toString())"
The result is the same as above.
(Note that the Node.js command argument (the string passed to -pe) too is subject to the escaping headaches, which were avoided in this case by using "..." as the outer quoting and '...' as the inner quoting, given that ' chars. need no escaping.)