
I am trying to get the file size of large files (some are over 1 gig) using FileInfo. It works, but it takes 20 seconds or so. Really, all I need is to figure out whether the file is over a certain size, for example 100 MB. Is there a quicker way to do this?

    Dim fileSize As Long = New System.IO.FileInfo(ProcessPath).Length

Edit: BTW, the file I am using to test is a 1.6 gig executable installer. So I am guessing that each file in the installer is being read, and that is why it is taking so long. Is there any way to time out after 5 seconds or so, since if it takes longer than 5 seconds, we can safely assume that it is a large file?

Dan
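One way to honor that 5-second cutoff (a minimal sketch added for illustration; `IsProbablyLarge` and the `limitBytes` parameter are my own names, and `ProcessPath` is the variable from the question) is to run the `Length` call on a worker thread and stop waiting after 5 seconds:

    Imports System.IO
    Imports System.Threading.Tasks

    Module SizeCheck
        ' Returns True/False if the size check finishes within 5 seconds,
        ' or Nothing if it times out (per the question: assume a large file).
        Function IsProbablyLarge(path As String, limitBytes As Long) As Boolean?
            Dim work = Task.Run(Function() New FileInfo(path).Length > limitBytes)
            If work.Wait(TimeSpan.FromSeconds(5)) Then
                Return work.Result
            End If
            Return Nothing
        End Function
    End Module

Note the timeout only stops the caller from waiting; the abandoned task keeps running until the `Length` call itself returns.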
  • Why don't you use cmd.exe and parse the list? Just my 2 cents. – ken lacoste Jun 12 '15 at 16:20
  • [Some alternatives you might try](http://stackoverflow.com/questions/14407954/get-file-size-without-using-system-io-fileinfo) – Chris Jun 12 '15 at 18:01
  • I can look into cmd.exe if needed, thank you. Thank you Rfvgyhn, I tried these yesterday and had the same result. I think it is slow because the 1.6 gig file is basically a zipped file. So maybe I should have .NET check to see if the file is a zipped file first (or something like that)? – Dan Jun 12 '15 at 18:14
  • This isn't really possible, the file size is read from the directory entry. Can't take more than ~20 msec worst case unless you are doing this over a really pokey network connection. Tinkering with executable files is never not a problem on machines today, disable your anti-malware or make an exclusion and try again. – Hans Passant Jun 12 '15 at 18:36
  • Thank you for everyone's help. The problem was that the file was in use. It works fine when the file is not in use. Is there any way to get the file size when the file is in use? – Dan Jun 13 '15 at 06:17
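On that last comment: a minimal sketch of one common workaround, assuming the process holding the file opened it with sharing allowed. Opening a `FileStream` with a permissive `FileShare` mask and reading its `Length` can succeed where other access fails; `GetLengthWhileInUse` is my own illustrative name:

    Imports System.IO

    Module SharedLength
        ' Open read-only while letting other processes keep reading,
        ' writing, or deleting the file, then read the size from the stream.
        Function GetLengthWhileInUse(path As String) As Long
            Using fs As New FileStream(path, FileMode.Open, FileAccess.Read,
                                       FileShare.ReadWrite Or FileShare.Delete)
                Return fs.Length
            End Using
        End Function
    End Module

If the other process opened the file with no sharing at all, even this open will throw; in that case the non-FileInfo alternatives in the question Chris linked above are another route.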

1 Answer


To rebuild the Windows Search index, head back to Control Panel > Indexing Options. Click the Advanced button and make sure you're on the Index Settings tab of the Advanced Options window, then click the Rebuild button under Troubleshooting.

http://www.tekrevue.com/tip/how-to-solve-windows-search-issues-index-rebuild/

  • Did the code in the question test slow for you? And were you testing on files over 1 gig? Just curious because this looks like it is getting a FileInfo object in a slightly different way but ultimately still doing the same thing so I'm surprised if it performs better than the original... – Chris Jun 12 '15 at 16:39
  • I thought the same, so I actually tested; it ran instantly on a 40 gig file – Michael Rudner Evanchik Jun 12 '15 at 16:43
  • Nice. And did the OP's code run slowly? Just wondering if it might be related more to the environment rather than the code (e.g. the file in question is on a network drive or something that is just slower to do everything). – Chris Jun 12 '15 at 16:54
  • I did not test the OP's; it could be a Microsoft Search cache index thing, so you're right – Michael Rudner Evanchik Jun 12 '15 at 17:21
  • Thank you, but it is the same speed as the original code. The large file is on my C drive, which is an SSD drive, so we do not have to worry about bottlenecks. And it happens on every computer I try it on. – Dan Jun 12 '15 at 17:24
  • Reads are faster on an SSD; it was probably set up or formatted wrong. There is more to it than just formatting these days with SSDs. – Michael Rudner Evanchik Jun 12 '15 at 17:28
  • Yeah, but it happens on every machine I try it on. FYI, I edited my question if you are interested. – Dan Jun 12 '15 at 17:58
  • Diskpart commands need to be executed after formatting, but anyway you can try a cleanup (don't do the formatting, obviously) with diskpart: http://knowledge.seagate.com/articles/en_US/FAQ/005929en?language=en_US Also try optimization (where defragmentation used to be). Again, Windows Search indexes all your files' locations and sizes when you're idle; look into letting the Microsoft Search service do a full run, and then see if it takes as long. Each machine has its own database file. – Michael Rudner Evanchik Jun 12 '15 at 18:02