
I would like to write a hash table out to a file, with an array as one of the hash table items. The array item is written out, but it appears as files=System.Object[] instead of the array's elements.

Note - Once this works, I will want to reverse the process and read the hash table back in again.

Clear-Host
$resumeFile = "c:\users\paul\resume.log"
$files = Get-ChildItem *.txt
$files.GetType()
Write-Host
$types = "txt"
$in = "c:\users\paul"

Remove-Item $resumeFile -ErrorAction SilentlyContinue
$resumeParms = @{}
$resumeParms['types'] = $types
$resumeParms['in'] = ($in)
$resumeParms['files'] = ($files)
$resumeParms.GetEnumerator() | ForEach-Object { "{0}={1}" -f $_.Name, $_.Value } | Set-Content $resumeFile
Write-Host "Contents of $resumeFile"
Get-Content $resumeFile

Results

IsPublic IsSerial Name                                     BaseType                                                      
-------- -------- ----                                     --------                                                      
True     True     Object[]                                 System.Array                                                  

Contents of c:\users\paul\resume.log
files=System.Object[]
types=txt
in=c:\users\paul
Paul Wasserman
    use the `Export-CliXml` cmdlet - that is what it is for. [*grin*] – Lee_Dailey Apr 16 '20 at 12:55
  • 1
    This is good, I like it and I could do this, but I'm trying to create a simple file that can be easily read and modified by users. XML based input/output isn't always user friendly. If this is the only option, then I may have to write out the files array as an XML into a separate file. Thank you Lee – Paul Wasserman Apr 16 '20 at 13:18
  • 1
    you are welcome! [*grin*] ///// so, if XML is too much for folks to work with, have you tried JSON? i don't have any experience with that format, but it is designed to be a more-human-friendly text format than XML. – Lee_Dailey Apr 16 '20 at 13:23
  • Possible duplicate with: [How can I write a nested arbitrary associative Array value set to a .psd1 file in powershell?](https://stackoverflow.com/q/41107531/1701026), [Update PSData Properties in Powershell Module Manifest through a PS Script](https://stackoverflow.com/q/51179602/1701026) and/or [Does PowerShell support HashTable Serialization?](https://stackoverflow.com/q/60621582/1701026) – iRon Apr 19 '20 at 08:05

2 Answers


The immediate fix is to create your own array representation by enumerating the elements, separating them with `,` and enclosing string values in `'...'`:

# Sample input hashtable. [ordered] preserves the entry order.
$resumeParms = [ordered] @{ foo = 42; bar = 'baz'; arr = (Get-ChildItem *.txt) }

$resumeParms.GetEnumerator() |
  ForEach-Object { 
    "{0}={1}" -f $_.Name, (
      $_.Value.ForEach({ 
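        # Boolean-index trick: primitive values (e.g., numbers) pass through as-is;
        # everything else is stringified and enclosed in '...', with embedded ' doubled.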
       (("'{0}'" -f ($_ -replace "'", "''")), $_)[$_.GetType().IsPrimitive] 
      }) -join ','
    )
  }

Note that this represents all non-primitive .NET types as strings, by their .ToString() representation, which may or may not be good enough.

The above outputs something like:

foo=42
bar='baz'
arr='C:\Users\jdoe\file1.txt','C:\Users\jdoe\file2.txt','C:\Users\jdoe\file3.txt'

See the bottom section for a variation that creates a *.psd1 file that can later be read back into a hashtable instance with Import-PowerShellDataFile.
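Since the question also asks about reversing the process, here is a minimal read-back sketch for this simple key=value format. It assumes the lines above were saved to a file (e.g. via `... | Set-Content $resumeFile`, as in the question) and uses Invoke-Expression to re-evaluate each value, so only use it on files you trust:

# Minimal read-back sketch (assumes the lines above were written to $resumeFile).
$readBack = [ordered] @{}
Get-Content $resumeFile | ForEach-Object {
  $name, $value = $_ -split '=', 2   # split on the first '=' only
  # Invoke-Expression re-evaluates the value text: 42 becomes an [int],
  # 'a','b','c' becomes an array of strings.
  $readBack[$name] = Invoke-Expression $value
}
$readBack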


Alternatives for saving settings / configuration data in text files:

  • If you don't mind taking on a dependency on a third-party module:

    • Consider using the PSIni module, which uses the Windows initialization file (*.ini) file format; see this answer for a usage example.

      • Adding support for initialization files to PowerShell itself (not present as of 7.0) is being proposed in GitHub issue #9035.
    • Consider using YAML as the file format; e.g., via the FXPSYaml module.

      • Adding support for YAML files to PowerShell itself (not present as of 7.0) is being proposed in GitHub issue #3607.
  • The Configuration module provides commands to write to and read from *.psd1 files, based on persisted PowerShell hashtable literals, as you would declare them in source code.

    • Alternatively, you could modify the output format in the code at the top to produce such files yourself, which allows you to read them back in via
      Import-PowerShellDataFile, as shown in the bottom section.

    • As of PowerShell 7.0 there's no built-in support for writing such a representation; that is, there is no complementary Export-PowerShellDataFile cmdlet. However, adding this ability is being proposed in GitHub issue #11300.

  • If creating a (mostly) plain-text file is not a must (see the round-trip sketches after this list):

    • The solution that provides the most flexibility with respect to the data types it supports is the XML-based CLIXML format that Export-Clixml creates, as Lee Dailey suggests, whose output can later be read with Import-Clixml. However, this format too has limitations with respect to type fidelity, as explained in this answer.

    • Saving a JSON representation of the data, as Lee also suggests, via ConvertTo-Json / ConvertFrom-Json, is another option, which makes for human-friendlier output than XML, but is still not as friendly as a plain-text representation; notably, all \ chars. in file paths must be escaped as \\ in JSON.
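For reference, minimal round-trip sketches of the two approaches Lee mentions (the file names are illustrative):

# CLIXML round trip: rich type fidelity, but not meant for hand-editing.
$resumeParms | Export-Clixml -Path resume.clixml
$fromCliXml = Import-Clixml -Path resume.clixml

# JSON round trip: more readable; nested objects beyond the default depth of 2
# are stringified, so raise -Depth as needed. ConvertFrom-Json returns
# [pscustomobject]s; in PowerShell 6+ you can add -AsHashtable to get a hashtable.
$resumeParms | ConvertTo-Json -Depth 3 | Set-Content resume.json
$fromJson = Get-Content resume.json -Raw | ConvertFrom-Json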


Writing a *.psd1 file that can be read with Import-PowerShellDataFile

Within the stated constraints regarding data types - in essence, anything that isn't a number or a string becomes a string - it is fairly easy to modify the code at the top to write a PowerShell hashtable-literal representation to a *.psd1 file so that it can be read back in as a [hashtable] instance via Import-PowerShellDataFile:

As noted, if you don't mind installing a module, consider the Configuration module, which has this functionality built in.

# Sample input hashtable.
$resumeParms = [ordered] @{ foo = 42; bar = 'baz'; arr = (Get-ChildItem *.txt) }

# Create a hashtable-literal representation and save it to file settings.psd1
@"
@{
$(
  ($resumeParms.GetEnumerator() |
    ForEach-Object { 
      "  {0}={1}" -f $_.Name, (
        $_.Value.ForEach({ 
          (("'{0}'" -f ($_ -replace "'", "''")), $_)[$_.GetType().IsPrimitive] 
         }) -join ','
      )
    }
  ) -join "`n"
)
}
"@ > settings.psd1

If you later read settings.psd1 with Import-PowerShellDataFile settings.psd1, you'll get a [hashtable] instance whose entries you can access as usual and which produces the following display output:

Name                           Value
----                           -----
bar                            baz
arr                            {C:\Users\jdoe\file1.txt, C:\Users\jdoe\file2.txt, C:\Users\jdoe\file3.txt}
foo                            42

Note how the order of entries (keys) was not preserved, because hashtable entries are inherently unordered.

On writing the *.psd1 file you can preserve the key(-creation) order by declaring the input hashtable (System.Collections.Hashtable) as [ordered], as shown above (which creates a System.Collections.Specialized.OrderedDictionary instance), but the order is, unfortunately, lost on reading the *.psd1 file.

As of PowerShell 7.0, even if you place [ordered] before the opening @{ in the *.psd1 file, Import-PowerShellDataFile quietly ignores it and creates an unordered hashtable nonetheless.
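To round things out, a minimal read-back sketch (assuming settings.psd1 was created by the code above in the current directory):

$settings = Import-PowerShellDataFile ./settings.psd1
$settings.foo      # -> 42
$settings.arr      # -> the array of file-path strings
$settings['bar']   # index syntax works too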

mklement0

This is a problem I deal with all the time and it drives me mad. I really think that there should be a function specifically for this action... so I wrote one.

function ConvertHashTo-CSV
{

Param (
    [Parameter(Mandatory=$true)]
    $hashtable, 
    [Parameter(Mandatory=$true)]
    $OutputFileLocation
    )

$hashtableAverage = $NULL # This will only work for hashtables where each entry is consistent. This checks for consistency.
foreach ($hashtabl in $hashtable)
{
    $hashtableAverage = $hashtableAverage + $hashtabl.count # Counts the number of headings.
}

$Paritycheck = $hashtableAverage / $hashtable.count # Gets the average number of headings

if ( ($parity = $Paritycheck -is [int]) -eq $False) #if the average is not an int the hashtable is not consistent
    { 
    write-host "Error. Hashtable is inconsistent" -ForegroundColor red
    Start-Sleep -Seconds 5
    return
    }

$HashTableHeadings = $hashtable[0].GetEnumerator().name #Get the hashtable headings 
$HashTableCount = ($hashtable[0].GetEnumerator().name).count #Count the headings

$HashTableString = $null # String to hold the CSV

foreach ($HashTableHeading in $HashTableHeadings) #Creates the first row containing the column headings
{
    $HashTableString += $HashTableHeading
    $HashTableString += ", "
}

$HashTableString = $HashTableString -replace ".{2}$" # Removes the trailing ", " added by the loop above

$HashTableString += "`n"


foreach ($hashtabl in $hashtable) #Adds the data
{

    for($i=0;$i -lt $HashTableCount;$i++)
        {
        $HashTableString += $hashtabl[$HashTableHeadings[$i]] # Look up the value by heading name so columns stay aligned
            if ($i -lt ($HashTableCount - 1))
                {
                $HashTableString += ", " 
                }       
        }
    $HashTableString += "`n"
}

$HashTableString | Out-File -FilePath $OutputFileLocation #writes the CSV to a file

}

To use this copy the function into your script, run it, and then

ConvertHashTo-CSV -hashtable $Hasharray -OutputFileLocation c:\temp\data.CSV

The code is annotated, but briefly: it steps through the array of hashtables, appends each entry to a string with the formatting required to make that string a valid CSV file, and then writes the string out to a file.

The main limitation is that the hashtables in the array all have to contain the same number of fields. To get around this, if a hashtable has a field that doesn't contain data, ensure it contains at least a space. See the example below.
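For illustration, a hypothetical input array that satisfies the consistency requirement (the keys and values are made up):

$Hasharray = @(
    @{ Name = 'file1.txt'; Length = 10 },
    @{ Name = 'file2.txt'; Length = 20 }
)
ConvertHashTo-CSV -hashtable $Hasharray -OutputFileLocation c:\temp\data.csv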

More on this can be found here : https://grumpy.tech/powershell-convert-hashtable-to-csv/

FrankU32