
I am trying to convert an .etl file into a .txt file. Right now, I am using the following command to get .txt files from the .etl file:

Get-WinEvent -Path $Path -Oldest -ErrorAction SilentlyContinue -ErrorVariable errors |
    ForEach-Object { "{0},{1},{2},{3},{4}" -f $_.TimeCreated.ToString("yyyy-MM-ddTHH:mm:ss.ffffff"), $_.Id, $_.Level, $_.ProviderName, $_.Message } |
    Add-Content -Path $LogFilePath

However, the .etl file is quite large, and the conversion takes about an hour to complete.

I was wondering if there is any other way to convert these .etl files into .txt files without much overhead. I tried looking into the tracerpt tool, however it only converts .etl files into .csv/.xml files.
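
For reference, this is roughly the tracerpt invocation I looked at (just a sketch; the file paths are placeholders):

# Sketch only: tracerpt decodes the .etl, but only to CSV or XML.
tracerpt "C:\logs\trace.etl" -o "C:\logs\trace.csv" -of CSV -y   # -y answers overwrite prompts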

  • 2
    If **tracerpt** is faster, why not use that? The other file types you state are just text that you can further convert, or you can simply rename the .csv to .txt. So I am going to assume you tried that and decided you want a one-pass effort? Also, are you saying the table format of the CSV is unreadable and the XML is overloaded? When you are serializing large datasets/files, you should expect time impacts. [Have you looked at this as well?](https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/pktmon-etl2txt) (see the sketch after these comments) – postanote Jul 29 '22 at 19:30
  • 1
    Why do you want to convert to .txt? Use a proper tool to interact with the .etl instead; opening and filtering directly on the binary format is obviously much faster. – phuclv Jul 30 '22 at 01:37
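
A minimal sketch of the two suggestions in the first comment (placeholder paths; pktmon ships with recent Windows builds, and how fully it decodes the events depends on the providers in the trace):

# The CSV that tracerpt produces is already plain text, so it can simply be renamed.
Rename-Item "C:\logs\trace.csv" -NewName "trace.txt"

# pktmon etl2txt converts an .etl straight to text (see the docs linked in the comment).
pktmon etl2txt "C:\logs\trace.etl" -o "C:\logs\trace.txt"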

2 Answers


Perhaps this is not the answer you're looking for, but I recommend it nonetheless.
Unfortunately, there is no out-of-the-box .NET way of doing this.
Event tracing can be involved, but if you figure it out, you will gain in performance.

Microsoft has a couple of examples of how to read .etl files using C++ and the native APIs.

Check this out:
https://learn.microsoft.com/en-us/windows/win32/etw/using-tdhformatproperty-to-consume-event-data

FranciscoNabas

The .NET System.IO calls seem to be faster than Get-Content and Set-Content. Doing a straight CSV-to-TXT conversion was almost twice as fast, and it should be for any other format you throw at it:

# $file1 is a 71,070 line CSV
$file1 = "C:\Users\Username\Desktop\_test.csv"
$file2 = "C:\Users\Username\Desktop\_test.txt"

#### Test 1 - Get-Content -Raw
$start1 = Get-Date
    Get-Content $file1 -Raw | Set-Content $file2
$end1 = Get-Date

#### Test 2 - .NET's System.IO
$start2 = Get-Date
    $txt = [System.IO.File]::ReadAllText("$file1")
    [System.IO.File]::WriteAllText("$file2", $txt)
$end2 = Get-Date

New-TimeSpan -Start $start1 -End $end1
New-TimeSpan -Start $start2 -End $end2

# TotalMilliseconds : Attempt [1] 109.3493 [2] 93.7438 [3] 78.1255
# TotalMilliseconds : Attempt [1]  49.4906 [2] 46.8493 [3] 46.8738

.NET was also faster when doing string manipulations, such as modifying the time and date format as you do in your example. For a better breakdown, check out the Stack Overflow questions and answers here.
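
Applied to your example, here is a minimal sketch of the same idea, assuming your existing $Path and $LogFilePath variables: replace Add-Content, which reopens the output file for every batch of pipeline input, with a single System.IO.StreamWriter. Most of the hour is probably spent in Get-WinEvent decoding the events, so the gain may be modest, but the write side becomes as cheap as it can be.

# Sketch: stream the formatted lines through one StreamWriter instead of Add-Content.
# $Path and $LogFilePath are the variables from the question; use a full path for
# $LogFilePath, since StreamWriter resolves relative paths against the process
# working directory rather than the current PowerShell location.
$writer = [System.IO.StreamWriter]::new($LogFilePath, $false)   # $false = overwrite
try {
    Get-WinEvent -Path $Path -Oldest -ErrorAction SilentlyContinue -ErrorVariable errors |
        ForEach-Object {
            $line = "{0},{1},{2},{3},{4}" -f $_.TimeCreated.ToString("yyyy-MM-ddTHH:mm:ss.ffffff"), $_.Id, $_.Level, $_.ProviderName, $_.Message
            $writer.WriteLine($line)
        }
}
finally {
    $writer.Dispose()   # flush and close the file even if something fails mid-pipeline
}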

DBADon