
I'm trying to start a program in a `Start-Job` script block using a variable for the path. Here is the line:

All three variables work, and the whole line works when I use `c:\plink` in place of the `$plink` variable. It errors out on `-telnet`, so the arguments aren't reaching plink.

Here are the variables and the job:

Thanks!

  • Share all the code (the definition of the variables as well as the `start-job` statement) – Santiago Squarzon May 04 '22 at 21:52
  • Try with this instead `& "$using:PlinkDir\plink.exe" -telnet...` (note the double-quotes) – Santiago Squarzon May 04 '22 at 22:23
  • Change `TimeStamp >> ..` for `TimeStamp | Set-Content -Encoding utf8 "$using:LogDi....` NOTE, this will replace the existing file if it exists; if you want to append to an existing file, use `Add-Content` instead – Santiago Squarzon May 05 '22 at 00:53
  • `>>` is an alias for `Out-File -Append`, kinda different from `Set-Content` – Santiago Squarzon May 05 '22 at 01:56
  • So I understand you need to run this in a Job because you want to kill it at a given time while you do other stuff, and you expect to get the data outputted by `plink.exe` up to the time the Job was killed into the file. Is that a right assumption ? And if so, how are you killing the Job (are you using `Stop-Job`) ? – Santiago Squarzon May 05 '22 at 02:10
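A minimal sketch of the distinction drawn in those comments, using a hypothetical example.log path:

'first run'  | Set-Content -Encoding utf8 .\example.log  # creates example.log, or overwrites it if it already exists
'second run' | Add-Content -Encoding utf8 .\example.log  # appends, creating the file if needed

`>>` also appends, but it goes through `Out-File`, so it applies `Out-File`'s formatting and default encoding rather than `Set-Content`/`Add-Content`'s.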

1 Answer


This answer is based on some assumptions and a hunch about what might work, given what is explained in your question and in the comments.

First of all, to explain "It didn't lose any data when PowerShell was killed unexpectedly.": this is because `>>` (which behaves like `Out-File -Append`) is:

  1. Opening the file stream
  2. Appending output to the file
  3. Closing the stream

So, when you kill the job, whatever has already been written is still there. I did recommend using `Set-Content`, but that was before I understood what you were doing; in this case it wouldn't be an option.
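As a minimal illustration of that open / append / close cycle (the file name is just an example):

'line 1' >> .\example.log  # opens example.log, appends one line, closes it
'line 2' >> .\example.log  # opens it again, appends, closes again

Because every call closes the stream, anything appended before the process is killed is already on disk.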

The alternative proposed here is to use a StreamWriter, which is nice because we can keep the file stream open and append to the file as needed, without having to close the stream each time (this also takes care of the "blank line between every line in the output to the log file"). To still save the results to the file when the Job is killed, we can use a try / finally statement.

$LogDir     = "c:\users\user" # Log file output directory
$PlinkDir   = "C:"            # plink.exe directory
$SerialIP   = "1.1.1.1"       # serial device IP address
$SerialPort = 10000           # port to log

function CaptureWeight {
    Start-Job -Name WeightLog -ScriptBlock {
        # Prefix each line coming from plink.exe with a timestamp
        # and write it straight to the already-open StreamWriter
        filter TimeStamp {
            $sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
        }

        try {
            $sw = [System.IO.StreamWriter]::new("$using:LogDir\WeightLog_$(Get-Date -f MM-dd-yyyy).txt")
            & "$using:PlinkDir\plink.exe" -telnet $using:SerialIP -P $using:SerialPort | TimeStamp
        }
        finally {
            # Runs even when the Job is stopped, so buffered output is not lost
            $sw.ForEach('Flush')
            $sw.ForEach('Dispose')
        }
    }
}

$job = CaptureWeight     # For testing, save the job
Start-Sleep -Seconds 60  # wait 1 minute
$job | Stop-Job          # kill the job
Get-Content "$LogDir\WeightLog_$(Get-Date -f MM-dd-yyyy).txt" # Did it work?
Santiago Squarzon
  • @Retrotube my pleasure! I'm glad it worked, it was mostly based on guessing hehe. Reg. the "updating the file" after 2 minutes: you could set the `.AutoFlush` property of `$sw` to `$true`, that should take care of updating the file on each call of `.WriteLine`. I'm afk right now, kinda hard to edit my code from the phone. You can ask a new question and if nobody answers I'll answer it when I'm back – Santiago Squarzon May 05 '22 at 21:13
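Following that last comment, a minimal sketch of the suggested change, assuming the rest of the job stays as above (only the line creating the writer gains a companion line):

$sw = [System.IO.StreamWriter]::new("$using:LogDir\WeightLog_$(Get-Date -f MM-dd-yyyy).txt")
$sw.AutoFlush = $true  # flush the internal buffer after every WriteLine, so the log file updates in near real time

With `AutoFlush` enabled, the explicit `Flush` in the `finally` block becomes redundant, though it is harmless to keep.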