
I have a Perl script that is used to monitor databases, and I'm trying to rewrite it as a PowerShell script.

In the Perl script there is a function that reads through the errorlog, filters out what matters, and returns it. It also saves the current position in the log file so that the next time it has to read the log it can start where it left off instead of reading the whole log again. This is done by using the tell function.

My idea is to use the Get-Content cmdlet: start reading at the last saved position, process each line until the end of the file, and then save the new position.
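That idea can be sketched with a line count instead of a byte offset (simpler than seeking, at the cost of re-reading the skipped lines each run). The paths and the two-line sample log below are assumptions for illustration only:

```powershell
# Sketch: resume by line number. A tiny sample log stands in for the real ERRORLOG.
$log   = Join-Path ([System.IO.Path]::GetTempPath()) 'sample_errorlog.txt'
$state = Join-Path ([System.IO.Path]::GetTempPath()) 'sample_lastline.xml'
Remove-Item $state -ErrorAction SilentlyContinue      # start the demo fresh
Set-Content $log @('spid5s  Recovery is complete.',
                   'Logon   Error: 18456, Severity: 14, State: 38.')

$skip  = if (Test-Path $state) { Import-Clixml $state } else { 0 }
$lines = @(Get-Content $log)
$hits  = $lines | Select-Object -Skip $skip | Where-Object { $_ -match 'Error:' }
$lines.Count | Export-Clixml $state                   # remember where we stopped
$hits
```

On the next run `$skip` comes back from the state file, so only lines appended since the last check are examined.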

Does anyone know a trick to get the position in the log file after reading, and to start reading at a particular position?

Or is there a better and/or easier way to achieve this?

Gísli

EDIT: This has to be done through the script and not with some other tool.

EDIT: So I'm getting somewhere with the .NET API, but it's not quite working for me. I found some helpful links here and here.

Here is what I have so far:

function check_logs {
    param($logs, $logpos)
    $count = 1
    $path = $logs.file
    $br = 0
    $reader = New-Object System.IO.StreamReader("$path")
    $reader.DiscardBufferedData()
    $reader.BaseStream.Seek(5270, [System.IO.SeekOrigin]::Begin)
    for(;;){
        $line = $reader.ReadLine()
        if($line -ne $null){$br = $br + [System.Text.Encoding]::UTF8.GetByteCount($line)}
        if($line -eq $null -and $count -eq 0){break}
        if($line -eq $null){$count = 0}
        elseif($line.Contains('Error:')){
            $l = $line.split(',')
            Write-Host "$line  $br"
        }
    }
}

I haven't found a way to use the seek function correctly. Can someone point me in the right direction?

If I run this, it outputs 5270; but if I run it without the line where I seek in the base stream, I get:

2011-08-12 08:49:36.51 Logon       Error: 18456, Severity: 14, State: 38.  5029
2011-08-12 08:49:37.30 Logon       Error: 18456, Severity: 14, State: 38.  5270
2011-08-12 16:11:46.58 spid18s     Error: 1474, Severity: 16, State: 1.  7342
2011-08-12 16:11:46.68 spid18s     Error: 17054, Severity: 16, State: 1.  7634
2011-08-12 16:11:46.69 spid29s     Error: 1474, Severity: 16, State: 1.  7894

The first part of each line is the line read from the log, and the number at the end is the bytes read at that point. So, as you can see, I'm trying to use the seek function to skip the first error line, but as I said, the only output is 5270 when I use seek.

What am I missing?

Gísli

  • +1 for mentioning the [tell](http://perldoc.perl.org/functions/tell.html) function. I hadn't heard of it before. Just wondering, do you expect rewriting the script to be less difficult than installing Perl or is there a requirements issue involved? – Christopher Bottoms Aug 11 '11 at 14:44
  • The rewrite is hell if that's what you're asking. The thing is that we monitor databases for others and this would make the setup easier at new windows servers in the future. I should also mention that I'm a sort of an intern this summer and this project is also a way to let me learn the code and understand what the scripts are doing. And they also wanted someone to start learning powershell. So in short, it would probably be easier to install perl and use that script. – Gisli Aug 11 '11 at 14:53
  • not quite sure what you're asking for at the moment - i'll have another read but check your variables - $l is the return of the split but you are writing $line - did you mean that? – Matt Aug 17 '11 at 17:44

3 Answers


You would probably be able to do this with some .NET objects, etc.

If it's a more standard-format log file, though, I wouldn't look much past LogParser. It was awesome before its time and is still awesome!

You can use it via the command line or via COM with PowerShell. It has the ability to mark where it was in a file and pick up from there (it stores the info in an .lpc file).

Maybe someone will come up with a good way of doing this, but if not, you could also look at writing the error information to the event viewer instead. You can store the last event ID or the last time you searched the event viewer and check from there each run.

hopefully there is something better...

EDIT:

If the file is tab delimited, you can use the Import-Csv command and store the last line number (it'd be either count or count-1, depending on whether the header is included in count). With the last number you can jump to the last point in the file.

# use Import-Clixml to get the $last_count saved by the previous run
$last_count = Import-Clixml $path_to_last_count_xml
$file = Import-Csv filename -Delimiter "`t"
for ($i = $last_count; $i -lt $file.Count; $i++) {
    $line = $file[$i]
    # do something with $line
    ...
}
$file.Count | Export-Clixml $path_to_last_count_xml

# use this to clear the memory
Remove-Variable file
[GC]::Collect()

Or you could query the database directly using sp_readerrorlog, using the last check time the same way as the last count above.
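To illustrate that filtering step: sp_readerrorlog returns rows with LogDate and Text columns, which you can filter down to recent errors. The query itself would run over your own connection (e.g. with System.Data.SqlClient); the sample rows below are made up to stand in for the result set:

```powershell
# Sketch: filter rows as returned by "EXEC sp_readerrorlog 0" down to recent errors.
# Select-RecentErrors is a hypothetical helper; the sample rows are fabricated.
function Select-RecentErrors {
    param($Rows, [datetime]$Since)
    $Rows | Where-Object { $_.LogDate -ge $Since -and $_.Text -like 'Error:*' }
}

$sample = @(
    [pscustomobject]@{ LogDate = [datetime]'2011-08-12 08:49'; Text = 'Error: 18456, Severity: 14, State: 38.' }
    [pscustomobject]@{ LogDate = [datetime]'2011-08-12 16:11'; Text = 'Starting up database master.' }
)
Select-RecentErrors -Rows $sample -Since ([datetime]'2011-08-12 00:00')
```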

Matt
  • Thanks. If I understand you correctly the logparser is a tool that I would need to download and install. The reason why the script is being rewritten in powershell is so that we don't have to install perl on new windows servers we are monitoring. So even though this is a good tool we could just as well install perl on the machine and skip the rewrite. Thanks anyway – Gisli Aug 11 '11 at 14:34
  • This is the errorlog that MSSQL server creates. It is a plain text file with no extension(file type 'file') – Gisli Aug 11 '11 at 15:01
  • i think the file is tab delimited so I've added some code to do that, you could always use the loop for a normal text file as well (using Get-Content) to load the file. – Matt Aug 12 '11 at 05:29
  • Thanks, this might be something I can use. I'm working on it and I'll let you know – Gisli Aug 12 '11 at 09:31
  • This works but it is very slow when the file gets large because Import-Csv imports the whole file before searching – Gisli Sep 01 '11 at 11:12

If your objective is to read the file only from the last line read in the prior run, please feel free to disregard this answer. However, if you're just trying to get any errors since the last check time, this might help.


[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$server = 'ServerName'
$chkDate = Get-Date -Date '8/16/2011 15:00'  # time of last check
$srvObj = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $server
$srvObj.ReadErrorLog(0) | foreach { if ($_.LogDate -notlike '' `
    -and $_.LogDate -ge $chkDate `
    -and $_.Text -like 'Error: *') {$_}} | ft -AutoSize

If you pick up the last run time from a file, or just know you run this every hour or whatever, you can adjust $chkDate so it only shows errors from then to the end of the log.

(watch out for those back-ticks (`) at the end of the $srvObj.ReadErrorLog(0) line and the next line. They don't always come out for me in the HTML)
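A minimal sketch of the "pick up the last run time from a file" part, using Export-Clixml/Import-Clixml to round-trip a DateTime (the state-file path is an assumption; anywhere the script can write works):

```powershell
# Sketch: persist $chkDate between runs so each run only reports new errors.
$stamp = Join-Path ([System.IO.Path]::GetTempPath()) 'last_errorlog_check.xml'
$chkDate = if (Test-Path $stamp) { Import-Clixml $stamp } else { (Get-Date).AddDays(-1) }

# ... run the ReadErrorLog filter above with $chkDate ...

Get-Date | Export-Clixml $stamp   # the next run picks up from here
```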

Bruce
  • Thanks for the reply. The objective is to only read the file. But do you have a link so that I can read a little bit more about the functions you're using. Like ReadErrorLog, LogDate, Text. I'm afraid your code is a little bit to complicated for me at this point but I would like to understand it. – Gisli Aug 17 '11 at 09:13
  • You might get something out of this article at Hey, Scripting Guy http://blogs.technet.com/b/heyscriptingguy/archive/2011/05/31/use-powershell-to-get-the-sql-server-error-log.aspx The SMO version is almost at the bottom of the page. There's also a link to the Server class from the Microsoft.SQLServer.SMO .NET Framework at MSDN. – Bruce Aug 17 '11 at 15:15

There are two options I would suggest if you really want to go with PowerShell.

  1. Use the .NET file APIs.

  2. Read the whole contents of the log and then clear it. Store the parsed contents in a database.
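For option 1, here is a rough sketch of a tell/seek equivalent over .NET streams. The key is to track the byte offset yourself, because StreamReader buffers ahead, so BaseStream.Position overshoots the line just read. The function name is made up, and it assumes a UTF-8 log without a BOM and CRLF line endings:

```powershell
# Sketch of Perl's tell/seek: seek to the saved offset, read new lines,
# and count bytes by hand to compute the next offset.
function Read-NewLogLines {
    param([string]$Path, [long]$Offset)          # $Offset = value saved by the previous run
    $fs = [System.IO.File]::Open($Path, 'Open', 'Read', 'ReadWrite')
    $null = $fs.Seek($Offset, [System.IO.SeekOrigin]::Begin)
    $reader = New-Object System.IO.StreamReader($fs)
    while ($null -ne ($line = $reader.ReadLine())) {
        $Offset += [System.Text.Encoding]::UTF8.GetByteCount($line) + 2   # +2 for CRLF
        if ($line.Contains('Error:')) { $line }                           # emit matching lines
    }
    $reader.Close()
    $Offset                                      # last output = the new "tell" value
}
```

The caller keeps the last element of the output as the next run's offset, e.g. `$out = @(Read-NewLogLines $path $old); $new = $out[-1]`.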

JasonMArcher