64

I'm trying to create a JSON-serialized array. When that array contains only one item I get a string, not an array of strings (in JSON).

Multiple Items (works as expected):

PS C:\> @("one", "two") | ConvertTo-JSON
[
    "one",
    "two"
]

Single Item Array (not as expected):

PS C:\> @("one") | ConvertTo-JSON
"one"

Am I missing something?

Luggage

9 Answers

103

Try without the pipeline:

PS C:\> ConvertTo-Json @('one', 'two')
[
    "one",
    "two"
]
PS C:\> ConvertTo-Json @('one')
[
    "one"
]
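
The difference comes down to how the input reaches the cmdlet: the pipeline enumerates an array element by element, while an argument is passed as a single object. A minimal sketch of that distinction; the positional argument above binds to -InputObject, so the explicit form below is equivalent:

PS C:\> @("one") | ForEach-Object { $_.GetType().Name }   # pipeline unrolls the array first
String
PS C:\> ConvertTo-Json -InputObject @("one")              # the whole array arrives as one object
[
    "one"
]
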
Ansgar Wiechers
  • Ahh yea, I see how using the pipeline would be ambiguous in this case. Thank you. You made me realize that it's not ConvertTo-JSON specific but a general PowerShell array/pipeline issue, which led me to: http://superuser.com/questions/414650/why-does-powershell-silently-convert-a-string-array-with-one-item-to-a-string – Luggage Sep 06 '13 at 18:03
  • @Luggage If only there was sanity: `@(@(1)) | ConvertTo-Json` -- still "NOPE" – user2864740 Oct 31 '17 at 04:21
  • Actually, the pipeline approach was *wrong* in the first place: arrays are always iterated when passed to the pipeline. It's just lucky that the behavior of `ConvertTo-Json` is to collect all pipeline input in an array and output a single object; otherwise the result would have been 2 JSON objects, each a single string. – marsze Dec 27 '18 at 09:21
  • @marsze no, that's just plain wrong, pipelines are a fantastic alternative to the reverse data-direction of the nested, arguments-based function-call paradigm. Bare lists are iterated in a pipeline, yes, but such a list can contain items of any type, *including arrays*. How else would this work to get the length of each string's second word? `'hello there','it is i' | %{ ,@($_ -split ' ') } | %{ $_[1].Length }`. We can likewise do this for ConvertTo-Json like so: `,@(1) | ConvertTo-Json`. ConvertTo-Json craps itself; it's not operating correctly. – Hashbrown Aug 05 '20 at 17:04
  • @Hashbrown My wording might have been unfortunate. Any collection type (e.g. an array) is iterated when passed to the start of the pipeline. Of course, that only applies to the first level, as the elements and sub-elements can be collection types themselves. That's exactly what you demonstrated in your example: you pass in an array of strings and the split is performed on *each* single element. – marsze Aug 06 '20 at 07:49
  • Yeah, so as you can see you can feed ConvertTo-Json an array, but it still won't serialise it properly. It outputs {values:1, count:1}; there's something wrong with it, not with pipelining (it's not so much the wording as the apparent response to user2864740). – Hashbrown Aug 06 '20 at 10:42
30

I hit this problem as well, but it was because my structure was too deep and ConvertTo-Json flattens everything below a certain depth to a string.

For example:

PS C:\> $MyObject = @{ "a" = @{ "b" = @{ "c" = @("d") } } }
PS C:\> ConvertTo-Json $MyObject
{
    "a":  {
              "b":  {
                        "c":  "d"
                    }
          }
}

To fix this, you can pass a larger value to -Depth:

PS C:\> ConvertTo-Json $MyObject -Depth 100
{
    "a":  {
              "b":  {
                        "c":  [
                                  "d"
                              ]
                    }
          }
}
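
For reference, the cut-off mentioned above is ConvertTo-Json's -Depth parameter, which defaults to 2; anything nested deeper is flattened to its string form. You don't need 100 specifically; any value at or above the object's actual nesting depth (4 is already enough for this object) gives the same output:

PS C:\> ConvertTo-Json $MyObject -Depth 4
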
nkron
7

I just had the same issue and found out that you can simply append -AsArray to the ConvertTo-Json command. Examples:

❯ @("one") | ConvertTo-Json -AsArray       
[
  "one"
]
❯ @("one", "two") | Convert-ToJson -AsArray
[
  "one",
  "two"
]
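
As noted in the comments below, -AsArray only exists in PowerShell 7.x. A hedged sketch for scripts that also have to run on Windows PowerShell 5.1 (the fallback passes the array as an argument instead of via the pipeline):

if ($PSVersionTable.PSVersion.Major -ge 7) {
    @("one") | ConvertTo-Json -AsArray
}
else {
    # Windows PowerShell 5.1: pass the array as an argument so it isn't unrolled
    ConvertTo-Json -InputObject @("one")
}
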
DSpirit
  • That sounds like a useful switch but it doesn't exist for me. What version of PowerShell are you using? I'm on 5.1.19041.1320 – RSX Dec 06 '21 at 15:12
  • The `-AsArray` parameter is new in PowerShell 7.x, where this should be the recommended solution. – Björn Jarisch Feb 24 '22 at 10:26
6

Place a comma (the unary array operator) in front of the @(...):

,@("one") | ConvertTo-Json
[
  "one"
]
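
The leading comma wraps @("one") in an outer one-element array; the pipeline enumerates (and discards) that outer wrapper, so ConvertTo-Json receives the inner array as a single object. A quick sketch of the wrapping:

PS C:\> (,@("one")).Count
1
PS C:\> (,@("one"))[0].GetType().Name
Object[]

Note that on Windows PowerShell 5.1 this still serializes as an object with value/Count properties (see the answer further down); the clean ["one"] output above is what PowerShell 7.x produces.
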
B-Art
4

Faced the same issue today. Just to add, if you have an object like this

@{ op="replace"; path="clientName"; value="foo"}

then you have to specify it as

ConvertTo-Json @( @{ op="replace"; path="clientName"; value="foo"} )

The double @s can become confusing sometimes.
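
For reference, a sketch of what the wrapped call should emit (property order may vary, since the input is a hashtable):

PS C:\> ConvertTo-Json @( @{ op="replace"; path="clientName"; value="foo"} )
[
    {
        "op":  "replace",
        "path":  "clientName",
        "value":  "foo"
    }
]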

Saad
4

I faced this issue with an array that was a child in an object. The array had one object in it, and ConvertTo-Json was removing the object in the array.

Two things to resolve this:

1. I had to set the -Depth parameter on ConvertTo-Json:

$output = $body | ConvertTo-Json -Depth 10

2. I had to create the object in the array as a hashtable and then convert that to an object:

$myArray.Add([pscustomobject]@{prop1 = ""; prop2 = "" })
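
Putting both pieces together, a minimal sketch of the pattern described above ($myArray, prop1 and prop2 come from the snippet; the items property name is purely illustrative):

# build the array as a list of objects, not hashtables
$myArray = [System.Collections.Generic.List[object]]::new()
$myArray.Add([pscustomobject]@{ prop1 = ""; prop2 = "" })

# attach it to the parent object and serialize with enough depth
$body = [pscustomobject]@{ items = $myArray }
$output = $body | ConvertTo-Json -Depth 10
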
jaycer
1

In my case, the prepared list was an array returned from a function. This was a simple array of strings. I had to re-wrap the array in @($ReturnedObject) before it would work.

ConvertTo-Json @($ReturnedObject)
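
Why the re-wrap is needed, as a sketch with a hypothetical function name: returning a one-element array from a function unrolls it to a single string on the way out, so the caller has to rebuild the array:

function Get-Names { @("one") }      # hypothetical function returning a one-item array
$ReturnedObject = Get-Names
$ReturnedObject.GetType().Name       # String - the one-item array was unrolled on return
ConvertTo-Json @($ReturnedObject)    # re-wrapping in @() restores the array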

0

That's not a good idea if you need to stay 5.1- and 7.x-aware: 7.x does as you say, but 5.1 does this:

{
    "value":  [
                  "one"
              ],
    "Count":  1
}
Mr-Fly
0

This is an old question - but weirdly the PowerShell 7.x documentation has THIS tidbit: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertfrom-json?view=powershell-7.3

-NoEnumerate Specifies that output isn't enumerated.

Setting this parameter causes arrays to be sent as a single object instead of sending every element separately. This guarantees that JSON can be round-tripped via ConvertTo-Json.

This causes ConvertFrom-Json to create an array regardless of content. Talk about mysterious...
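
A sketch of the round-trip difference being described, assuming PowerShell 7.x:

PS C:\> '["one"]' | ConvertFrom-Json | ConvertTo-Json
"one"
PS C:\> '["one"]' | ConvertFrom-Json -NoEnumerate | ConvertTo-Json
[
  "one"
]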

Robin Johnson