
I created a class to gather some data in a script, although I'm not sure if this is the appropriate use for one. When I output the class to a text file, it adds 2 blank lines each time it writes to the file. Is there a way to remove this?

[int] $numOut = 0
[int] $numIn = 0
[int] $numNone = 0
[int] $numCPE = 0
[int] $numSQR = 0
[int] $numEGX = 0
[int] $numCQA = 0

Various parts of the code do a self-addition like this; these are the only types of manipulation applied to these variables:

$script:numOut += 1
$cLength = $randString.Length #this is a random string
$numSQR = $numCPE + $cLength #add CPE + length of random string
$total = $numOut + $numIn + $numNone + $numCPE + $numSQR + $numEGX + $numCQA

class Logging {
    [string]$DateTime
    [string]$User
    [string]$numOut
    [string]$numIn
    [string]$numNone
    [string]$numCPE
    [string]$numSQR
    [string]$numEGX
    [string]$numCQA
    [string]$total
}

$Logging = [Logging]::new()
$Logging.DateTime = Get-Date
$Logging.User = $env:username
$logging.NumOut = $numOut
$logging.NumIn = $numIn
$logging.NumNone = $numNone
$logging.NumCPE = $numCPE
$logging.NumSQR = $numSQR
$logging.NumEGX = $numEGX
$logging.NumCQA = $numCQA
$logging.Total = $total


write-output $logging | Format-Table -AutoSize -HideTableHeaders >> $CWD\log.txt

It writes to the file like this:

arealhobo 10/24/2020 19:47:24 1      0     1       1     1      0             1


arealhobo 10/24/2020 19:50:37 1      0     1       1     1      0             1


arealhobo 10/24/2020 19:53:15 1      0     1       1     1      0             1
arealhobo
  • Just to clarify: Your sample code writes a _single_ `Logging` instance, but your output suggests that _multiple_ objects were written, i.e. that a _collection_ of `Logging` instances served as the input. The way the sample output presents in your question is only _half_ explained by mixing single-byte and double-byte character encodings, so please clarify if this representation matches what you actually see, and whether you're viewing it via `Get-Content` in the console. As shown, one explanation could be that your input collection contains empty strings between the `Logging` instances. – mklement0 Oct 25 '20 at 16:29

3 Answers


You can replace the newlines first:

(write-output $logging | Format-Table -AutoSize -HideTableHeaders | Out-string) -replace "\n","" >> $CWD\log.txt
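
If you'd rather only drop the leading and trailing blank lines that Format-Table/Out-String produce (instead of stripping every newline), a closely related sketch, using the same variables as in the question, is:

# Trim the blank lines surrounding the rendered row, then append it:
($logging | Format-Table -AutoSize -HideTableHeaders | Out-String).Trim() >> $CWD\log.txt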
Wasif

You could also implement a method to handle outputting to a file. Here's an example.

class Logging {
    [string]$DateTime
    [string]$User
    [string]$numOut
    [string]$numIn
    [string]$numNone
    [string]$numCPE
    [string]$numSQR
    [string]$numEGX
    [string]$numCQA
    [string]$total

    # Append this instance as a tab-delimited row to the given file
    [void] Log([string]$file){
        $this | Export-Csv -Path $file -Delimiter "`t" -Append -NoTypeInformation
    }
}

$Logging = [Logging]::new()
$Logging.DateTime = Get-Date
$Logging.User = $env:username
$logging.NumOut = $numOut
$logging.NumIn = $numIn
$logging.NumNone = $numNone
$logging.NumCPE = $numCPE
$logging.NumSQR = $numSQR
$logging.NumEGX = $numEGX
$logging.NumCQA = $numCQA
$logging.Total = $total

Now you can simply call $logging.Log("path\to\logfile"), specifying where to write.

$Logging.log("c:\Some\Path\logging.log")
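
Since the log is written as a tab-delimited CSV, a possible follow-up (a sketch reusing the path from the example above) is to read it back as objects later:

# Re-import the tab-delimited log as objects for inspection:
Import-Csv -Delimiter "`t" -Path "c:\Some\Path\logging.log" | Format-Table -AutoSize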
Doug Maurer

Note: The scenario described below may not match the OP's. The answer may still be of interest if you find that file content prints as follows to the console after having used >> to append to a preexisting file in Windows PowerShell; note what appears to be extra spacing and extra empty lines:

[Screenshot: console output of the file's content, showing an apparent space between characters and an extra blank line after every line]


To avoid your problem, which most likely stems from an unintended mix of different character encodings in the output file produced by >>, you have two options:

  • If you do know the character encoding used for the preexisting content in the output file, use Out-File -Append and match that encoding via the -Encoding parameter:
# Using UTF-8 in this example.
$logging | Format-Table -AutoSize -HideTableHeaders |
  Out-File -Append -Encoding Utf8 $CWD\log.txt

Note that > / >> are in effect like calling Out-File / Out-File -Append, except that you don't get to control the character encoding.
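
For illustration, here is a sketch with a throwaway file name: the first two statements append the same content, but only the explicit cmdlet call lets you pick the encoding:

# >> behaves like Out-File -Append with the default encoding:
'some text' >> .\sample.txt
'some text' | Out-File -Append .\sample.txt

# Only the cmdlet form exposes the -Encoding parameter:
'some text' | Out-File -Append -Encoding Utf8 .\sample.txt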

  • In the unlikely event that you don't know the preexisting character encoding, you can use Add-Content, which matches it automatically - unlike >> / Out-File -Append - but that requires extra work:

    • An additional Out-String -Stream call is needed beforehand, to provide the formatting that >> (and > / Out-File) implicitly provide. Without it, Add-Content (and Set-Content) apply simple .ToString() stringification to the output objects, and in the case of the objects output by Format-* cmdlets that results in useless representations, namely their type names only (e.g., Microsoft.PowerShell.Commands.Internal.Format.FormatStartData):
# Add-Content, unlike >>, matches the character encoding of the existing file.
# Since Add-Content, unlike > / >> / Out-File, uses simple .ToString()
# stringification you first need a call to `Out-String`, which provides
# the same formatting that > / >> / Out-File implicitly does.
$logging | Format-Table -AutoSize -HideTableHeaders |
  Out-String -Stream | Add-Content $CWD\log.txt
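
For contrast, here is a sketch of what happens if you omit the Out-String call and pipe the Format-Table output straight to Add-Content (bad.txt is just an illustrative name):

# Without Out-String, Add-Content stringifies the formatting objects with
# .ToString(), so the file receives their type names instead of the table:
$logging | Format-Table -AutoSize -HideTableHeaders | Add-Content $CWD\bad.txt
Get-Content $CWD\bad.txt
# Microsoft.PowerShell.Commands.Internal.Format.FormatStartData
# Microsoft.PowerShell.Commands.Internal.Format.GroupStartData
# ...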

Read on for background information.


Assuming you're using Windows PowerShell rather than PowerShell [Core] v6+[1]:

The most likely cause (the explanation doesn't fully match the output in your question, but I suspect that is a posting artifact):

  • You had a preexisting log.txt file with a single-byte character encoding[2], most likely either the legacy encoding based on your system's active ANSI code page or a UTF-8 encoded file (with or without a BOM).

  • When you appended content with >>, PowerShell blindly used its default character encoding for > / >>, which in Windows PowerShell[1] is "Unicode" (UTF-16LE), a double-byte encoding[2] - in effect (but not technically) these redirection operators are aliases for Out-File [-Append].

The result is that the newly appended text is misinterpreted when the file is later read, because the UTF-16LE characters are read byte by byte instead of being interpreted as the two-byte sequences that they are.

Since characters in the ASCII range have a NUL byte as the 2nd byte in their 2-byte representation, reading the file byte by byte sees an extra NUL ("`0") character after every original character.

On Windows[3], this has two effects when you print the file's content to the console with Get-Content:

  • What appears to be a space character is inserted between ASCII-range characters so that, say, foo prints as f o o - in reality, these are the extra NUL characters.

  • An extra, (apparently) empty line is inserted after every line, which is a side effect of PowerShell accepting different newline styles interchangeably (CRLF, LF, CR):

    • Due to the extra NULs, the original CRLF sequence ("`r`n") is read as "`r`0`n`0", which causes PowerShell to treat "`r" and "`n" individually as newlines (line breaks), resulting in the extra line.

    • Note that the extra line effectively contains a single NUL, and that the subsequent line then starts with a NUL (the trailing one from the "`n"), so among the misinterpreted lines all but the first one appear to start with a space.
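
To make this concrete, here is a small repro sketch (it assumes Windows PowerShell 5.1, where >> defaults to UTF-16LE; demo.txt is just an illustrative file name):

# Create a file with a single-byte-based encoding (UTF-8 with BOM here) ...
'first line' | Out-File -Encoding Utf8 .\demo.txt
# ... then append with >>, which Windows PowerShell writes as UTF-16LE:
'second line' >> .\demo.txt

# Reading the file back shows the symptom for the appended text:
# an apparent space after every character, plus extra "empty" lines.
Get-Content .\demo.txt

# The giveaway is the presence of NUL characters in the raw content:
(Get-Content -Raw .\demo.txt) -match "`0"   # -> True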


[1] PowerShell [Core] v6+ now consistently defaults to BOM-less UTF-8 across all cmdlets. While >> (Out-File -Append) still doesn't match an existing file's encoding, the prevalence of UTF-8 files makes this less of a problem. See this answer for more information about character encoding in PowerShell.

[2] Strictly speaking, UTF-8 and UTF-16 are variable-length encodings, because not every byte in UTF-8 is necessarily its own character (that only applies to chars. in the ASCII range), and, similarly, certain (exotic) characters require two 2-byte sequences in UTF-16. However, it is fair to say that UTF-8 / UTF-16 are single/double-byte-based.

[3] On Unix-like platforms (Linux, macOS) you may not even notice the problem when printing to the terminal, because their terminal emulators typically ignore NULs, and, due to LF ("`n") alone being used as newlines, no extra lines appear. Yet, the extra NULs are still present.

mklement0
  • I am using PowerShell 5.1 and 7.0; the issue persists in both versions with >>. Now when I try using Add-Content, it outputs this block of text `Microsoft.PowerShell.Commands.Internal.Format.FormatStartData Microsoft.PowerShell.Commands.Internal.Format.GroupStartData Microsoft.PowerShell.Commands.Internal.Format.FormatEntryData Microsoft.PowerShell.Commands.Internal.Format.GroupEndData` Setting the encoding does not change this; I cannot explain why it's doing this. – arealhobo Oct 27 '20 at 17:05
  • @arealhobo: That was my bad: I forgot that a formatting cmdlet was involved - please see the update to the top section of this answer. – mklement0 Oct 27 '20 at 17:27
  • @arealhobo: In short: use `Out-File -Append -Encoding `. I've also added a more detailed explanation of the output you saw with just `Add-Content` alone. – mklement0 Oct 27 '20 at 17:41
  • @arealhobo, did my most recent suggestion resolve the issue for you? It would really be helpful to future readers as well to bring closure to this question. If the problem is what I described, future readers would not be helped by the currently higher-voted answers: one neither acknowledges nor addresses the `NUL` problem, and the other bypasses the problem by proposing an inefficient per-object solution that opens and closes the output file for each object. – mklement0 Oct 28 '20 at 01:28
  • Unfortunately not. The other solution, proposing a string replace of newlines, works, but I feel like it's a workaround rather than targeting the core issue. With Out-File and Out-String, the output is the same, whether appending to an existing file with a certain encoding or creating a new file. I did review my code to possibly find empty strings, as you mentioned in your comment on my post, but did not find anything. My code does use combinations of Out-File and Add-Content in various parts of the script, but those are not affected by the issue I am encountering here. – arealhobo Oct 28 '20 at 02:28
  • @arealhobo: If there are no `NUL` values in your output file - verify with `(gc -raw $CWD\log.txt) -match "\`0"` - then my answer definitely doesn't apply. Can you post a _minimal_ example with 2 sample objects that reproduces the problem? – mklement0 Oct 28 '20 at 02:38
  • @arealhobo. P.S.: The code in your question uses variables to assign the property values, and we don't know what these variables contain; if I fill in literals for testing, I do _not_ see your symptom. If I embed _newlines_ in the property values, they render with an ellipsis (`...`) instead of causing actual line breaks. – mklement0 Oct 28 '20 at 02:47
  • I just added some more code, not sure how helpful that would be. – arealhobo Oct 28 '20 at 05:44
  • Thanks, @arealhobo. The idea was to provide a [mcve], so that those trying to help can themselves reproduce the problem. I tried with your updated code - using `write-output $logging, $logging` to simulate _multiple_ instances, as your output implies - and I do _not_ see the problem. If you run _just_ the code that is now in your question, do you see it? If not, we haven't found a minimal example yet. Also, can you confirm that there are _no_ `NUL` values in the output file, based on `(gc -raw $CWD\log.txt) -match "\`0"`? What PowerShell version are you using? – mklement0 Oct 28 '20 at 13:53
  • This was in PowerShell Core 7.0; it does not produce any NUL values. I will need to revisit this now with a fresh pair of eyes. – arealhobo Dec 01 '20 at 09:27