
Is there a best practice for handling JSON documents with duplicated keys in PowerShell?

Ideally, I would like to collect the values of such duplicated keys into a single array mapped to that key.

For example:

{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : "id3",
  "column03" : "id4"
}

Transformed to:

{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : [
                 "id3",
                 "id4"
               ]
}

I have been exploring options with the ConvertTo-Json cmdlet, but have not found a solution.

Appreciate the help!

EdTheHorse

1 Answer


While JSON technically allows duplicate keys, this is not recommended, and I suggest handling it by normalizing the JSON. You can certainly pass your JSON with the duplicate key to ConvertFrom-Json, but that won't produce your desired output.

It should be...

$obj = @"
{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : ["id3", "id4"]
}
"@

Then use $json = $obj | ConvertFrom-Json to convert it into a PowerShell object.
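For example, after the conversion you can confirm that the merged key now holds both values:

$json = $obj | ConvertFrom-Json
$json.column03    # returns the two-element array: id3, id4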

Alternatively, you can start from a PowerShell object (here, a hashtable) and convert it to JSON.

$obj = @{
  "column01" = "id1";
  "column02" = "id2";
  "column03" = ("id3", "id4")
}

$json = $obj | ConvertTo-Json

$json
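Note that @{ } creates an unordered hashtable, so the key order in the resulting JSON may differ from the order shown above. If the order matters, an [ordered] dictionary preserves it:

$obj = [ordered]@{
  "column01" = "id1";
  "column02" = "id2";
  "column03" = ("id3", "id4")
}

$obj | ConvertTo-Json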

If you want to know how to normalize the data, I suggest you either edit your question or ask a new one.
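That said, purely as an illustration of one possible normalization (not a general JSON parser), here is a minimal sketch that merges duplicate keys into arrays, assuming a flat JSON object of simple "key" : "value" string pairs; the regex and variable names are just for this example:

$raw = @"
{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : "id3",
  "column03" : "id4"
}
"@

$merged = [ordered]@{}
foreach ($match in [regex]::Matches($raw, '"([^"]+)"\s*:\s*"([^"]+)"')) {
    $key   = $match.Groups[1].Value
    $value = $match.Groups[2].Value
    if ($merged.Contains($key)) {
        # key already seen: promote the existing value to an array and append
        $merged[$key] = @($merged[$key]) + $value
    }
    else {
        $merged[$key] = $value
    }
}

$merged | ConvertTo-Json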

mklement0