I am very new to PowerShell scripting. I have a script that exports folder details from network share paths (e.g. \\INTBMCELB\Transport_Admin):

```powershell
$infile = 'C:\Paths\PathList.txt'
$outdir = 'C:\Export\'

foreach ($dir in (Get-Content $infile)) {
    # Name each CSV after the last segment of the share path
    $outfile    = Join-Path $outdir (($dir -split '\\')[-1])
    $outfilecsv = $outfile + '.csv'

    Get-ChildItem -Path $dir -Filter *.* -Recurse |
        Select-Object FullName, CreationTime, @{Name='Size';Expression={$_.Length}} |
        Export-Csv -Path $outfilecsv -Encoding ASCII -NoTypeInformation
}
```

Each network path read from the text file contains a lot of data, and there are around 2,500 share paths to process. The script currently takes around two days to complete. Can I improve the performance? Are there changes I can make to the script above to reduce the total execution time?

  • The bottleneck is probably not in the script but in the amount of slow jobs. I would recommend to do this in parallel, especially if network paths are on different machines. Take a look at this answer for some tools: http://stackoverflow.com/a/21520735/323582 – Roman Kuzmin Apr 16 '14 at 06:56
  • 1
  • I also recommend to check for existence of `$dir` before doing `Get-ChildItem` on it. Because in PS v3.0+ on a missing path `Get-ChildItem -Recurse` searches in the current location instead (weird!). And if that is, say, C:\, it takes ages, indeed, and more likely does not do what one wants. – Roman Kuzmin Apr 16 '14 at 07:04
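
As the first comment suggests, the dominant cost here is waiting on 2,500 network enumerations one at a time, so running several paths concurrently should help the most. Below is a minimal sketch of that idea using built-in background jobs with a simple throttle; the job count of 8 is an illustrative assumption to tune for your network and CPU:

```powershell
$infile  = 'C:\Paths\PathList.txt'
$outdir  = 'C:\Export\'
$maxJobs = 8   # illustrative throttle; tune for your environment

foreach ($dir in (Get-Content $infile)) {
    # Wait until a job slot frees up before starting the next path
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Milliseconds 200
    }

    Start-Job -ArgumentList $dir, $outdir -ScriptBlock {
        param($dir, $outdir)
        $outfilecsv = (Join-Path $outdir (($dir -split '\\')[-1])) + '.csv'
        Get-ChildItem -Path $dir -Recurse |
            Select-Object FullName, CreationTime, @{Name='Size';Expression={$_.Length}} |
            Export-Csv -Path $outfilecsv -Encoding ASCII -NoTypeInformation
    } | Out-Null
}

# Drain the remaining jobs before exiting
Get-Job | Wait-Job | Remove-Job
```

Background jobs carry noticeable per-job startup overhead; runspace pools or the tools linked in the first comment scale better, but jobs are the simplest built-in option.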
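
And per the second comment, guarding each path with `Test-Path` prevents `Get-ChildItem -Recurse` from silently searching the current location when a share is missing. A sketch of that guard, reusing `$dir` and `$outfilecsv` from the loop in the question:

```powershell
if (Test-Path -LiteralPath $dir -PathType Container) {
    Get-ChildItem -Path $dir -Recurse |
        Select-Object FullName, CreationTime, @{Name='Size';Expression={$_.Length}} |
        Export-Csv -Path $outfilecsv -Encoding ASCII -NoTypeInformation
}
else {
    # Log and skip rather than letting Get-ChildItem fall back to the current location
    Write-Warning "Skipping missing or unreachable path: $dir"
}
```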

0 Answers