To complement Santiago's helpful answer:
The enumeration behavior of PowerShell's pipeline means that a receiving command fundamentally cannot tell the difference between pipeline input that was provided as (a) a single input object or (b) as a single-element array.
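A quick way to see this for yourself, using a hypothetical helper function (Show-InputType is made up for this demonstration) that simply reports the type of each pipeline input object:

```powershell
# Hypothetical helper: reports the type of each pipeline input object.
function Show-InputType { process { $_.GetType().Name } }

42    | Show-InputType  # -> Int32
@(42) | Show-InputType  # -> Int32 too: the single-element array was enumerated
```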
By contrast, when passing input as an argument, the target command can make such a distinction - if designed to do so - and ConvertTo-Json does.
- However, it is rare for cmdlets to make this distinction - see GitHub issue #4242 for a discussion.
As an alternative to passing input by argument, PowerShell (Core) 7+ introduced the -AsArray switch, which requests that even a single input object (which may originally have been a single-element array) be treated as an array in its JSON representation.
# PS v7+ only; ditto for @(42) as input.
42 | ConvertTo-Json -AsArray -Compress # -> '[42]'
As iRon points out, you can achieve the same outcome by ensuring that a given array - even if it contains just one element - is sent through the pipeline as a whole, which also works in Windows PowerShell.
- Note: While with ConvertTo-Json it's much simpler to pass an array as an argument, as shown in Santiago's answer, the techniques below may be of interest for commands that do not support array-valued arguments or that support pipeline input only.
# Works in Windows PowerShell too.
# The unary form of the "," operator ensures that the array
# is sent *as a whole* through the pipeline.
, @(42) | ConvertTo-Json -Compress # -> '[42]'
The unary form of ,, the array constructor ("comma") operator, constructs what acts as a transient, auxiliary array here:
- Its one and only element is the input array.
- When the pipeline enumerates this array, its one and only element - the array of interest - is sent as a whole through the pipeline.
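You can inspect this transient wrapper array directly; the variable name below is incidental:

```powershell
# The unary "," wraps the array of interest in a one-element outer array:
$wrapped = , @(42)
$wrapped.Count              # -> 1: the outer array has a single element...
$wrapped[0].GetType().Name  # -> Object[]: ...which is the array of interest
```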
There's a less obscure - but less efficient - alternative, using Write-Output with its -NoEnumerate switch:
# Works in Windows PowerShell too.
# -NoEnumerate prevents enumeration of the input array
# and sends it through the pipeline as a whole.
Write-Output -NoEnumerate @(42) | ConvertTo-Json -Compress # -> '[42]'
Note: While the result is the same as with the v7+ -AsArray switch, the mechanism is different:
With the auxiliary-array / non-enumeration technique, ConvertTo-Json truly receives an array as its one and only input object.
With the v7+ -AsArray switch, when ConvertTo-Json receives a scalar (non-array) as its only input object, it still treats it as an array.
If multiple input objects are received, -AsArray is a no-op, because even without this switch a JSON array must of necessity be output, given that ConvertTo-Json always collects its input up front and then outputs a single JSON document for it.
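To illustrate the no-op behavior with multiple input objects:

```powershell
# With multiple input objects, a JSON array is output either way:
1, 2 | ConvertTo-Json -Compress           # -> '[1,2]'
1, 2 | ConvertTo-Json -Compress -AsArray  # -> same: '[1,2]'
```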
Do not use -AsArray in combination with an argument (as opposed to pipeline input), as that will result in a nested JSON array, at least as of this writing (PowerShell 7.3.4):
ConvertTo-Json -AsArray -Compress @(42) # !! -> '[[42]]'
The design rationale behind PowerShell's enumeration behavior:
PowerShell is built around pipelines: data conduits through which objects stream, one object at a time.[1]
PowerShell commands output to the pipeline by default, and any command can write any number of objects, including none - and that number isn't known in advance, because it can vary depending on arguments and external state.
- E.g., Get-ChildItem *.txt can situationally emit none, 1, or multiple objects.
Since the pipeline is just a stream of objects of unspecified count, there is no concept of an array in the pipeline itself, neither on input nor on output:
On input, arrays (and most enumerables)[2] are enumerated, i.e. the elements are sent one by one to the pipeline. Therefore, there is no difference between sending a scalar (single object) and sending a single-element array through the pipeline, as demonstrated above.
On output, multiple objects are simply emitted one at a time (though it is possible - but rare - to send an array (or other list-like type) as a whole, in which case it is itself just another, single output object in the pipeline).
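The output side can be demonstrated by counting what the next pipeline stage actually receives:

```powershell
# Works in Windows PowerShell too.
(& { 1; 2 }     | Measure-Object).Count  # -> 2: two separate output objects
(& { , (1, 2) } | Measure-Object).Count  # -> 1: the array was sent as a whole
```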
[1] You can introduce buffering of multiple objects with the common -OutBuffer parameter, but the next command in a pipeline still receives the buffered objects one by one.
[2] For details, see the bottom section of this answer.