
Couldn't find an answer to this question anywhere on the forum. Is there a way to read, for example, 5 txt files at once instead of just one? I want to access the disk fewer times.

Etheryte
UserED

5 Answers


No, that is not possible. If you want to reduce your disk I/O, maybe work with one file and seek through it to access different areas. Look at this SO article: c# - Read specific bytes of a file
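A minimal sketch of the seek-based approach, assuming a small binary file written for the demo (the file name `data.bin` is made up):

```csharp
using System;
using System.IO;

class SeekDemo
{
    static void Main()
    {
        // Write a sample file so the demo is self-contained.
        File.WriteAllBytes("data.bin", new byte[] { 1, 2, 3, 4, 5, 6, 7, 8 });

        using (var fs = new FileStream("data.bin", FileMode.Open, FileAccess.Read))
        {
            // Jump straight to byte offset 4 instead of reading from the start.
            fs.Seek(4, SeekOrigin.Begin);
            int value = fs.ReadByte(); // reads the single byte at offset 4
            Console.WriteLine(value);  // 5
        }
    }
}
```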

BendEg

Just read the five files. Long gone are the days when there was any sort of direct causal relationship between I/O issued by your program and what happens at the disk. Caches (yes, plural), command queuing, predictive fetching, and other strategies all mean that the best thing you can do for performance is just to tell the OS what you want, and let it figure out the best way to achieve that.

(And no, it's not possible to do precisely what you want anyway)
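To make the point concrete, "just read the five files" is this little; the file names are placeholders, and the sample files are created first so the snippet runs on its own:

```csharp
using System;
using System.IO;

class ReadFiles
{
    static void Main()
    {
        // Hypothetical file names; substitute your own paths.
        string[] paths = { "a.txt", "b.txt", "c.txt", "d.txt", "e.txt" };

        foreach (var path in paths)
            File.WriteAllText(path, "contents of " + path); // sample data

        foreach (var path in paths)
        {
            // The OS, its caches, and the drive decide how the actual
            // disk I/O is scheduled; your code just states what it wants.
            string text = File.ReadAllText(path);
            Console.WriteLine(text);
        }
    }
}
```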

Peter Duniho

Not possible... Read the files one after another... Don't complicate your life thinking about performance in this scenario. :-)

Good luck...

Ashwath

Depending on what you want to do with the files, you can either use five calls to read all lines, one per file.

Or you open a StreamReader per file and do the reading in a loop, interleaving them line by line. This won't be truly "at once", but it's pretty close.

The last option is multithreading: you can read the files asynchronously, firing off the reading process for each file on a different thread.

But keep in mind that if you are reading from the same storage device, this won't give you any extra performance! You will only get a performance boost when reading from different storage devices, perhaps storage using RAID, and only if the files are large enough to have a decent read time, so that you can even feel the performance gain.
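The asynchronous option above might look like the following sketch. It assumes `File.ReadAllTextAsync`, which is available in .NET Core 2.0 and later; the file names are made up and created first so the snippet is self-contained:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class AsyncReads
{
    static async Task Main()
    {
        string[] paths = { "a.txt", "b.txt", "c.txt" }; // hypothetical names
        foreach (var p in paths)
            File.WriteAllText(p, "data in " + p);       // sample files

        // Start all reads before awaiting any of them, so the
        // operations are in flight concurrently.
        var reads = new Task<string>[paths.Length];
        for (int i = 0; i < paths.Length; i++)
            reads[i] = File.ReadAllTextAsync(paths[i]);

        // Await them together; results come back in the same order.
        string[] contents = await Task.WhenAll(reads);
        foreach (var c in contents)
            Console.WriteLine(c);
    }
}
```

Whether this actually overlaps at the disk depends on the device and its queueing, as the answer notes.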

YouryDW

Disk access is most likely not your primary concern when reading strings from a text file, due to the overhead of garbage collection. See How to parse a text file in C# and be io bound?

You should focus more on reusing buffers and avoiding strings if you want to max out your disk with read requests. Additionally, you can issue read requests in parallel, hoping that the read queue in the RAID controller is optimized so that your random reads from different files are transformed into large sequential reads (native command queuing is one such feature). To make these optimizations happen you need to enqueue enough read requests in parallel (but not too many) to max out the RAID controller.

In reality it matters a lot whether you read a file once or several times, since most of the time all subsequent reads are served from the file system cache, which is basically a CopyMemory operation.

If you want to get even faster you should look into memory-mapped files, where the actual contents of the file system cache are mapped into your process's address space, which spares you one memory copy from kernel to user-mode buffers.
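A small sketch of the memory-mapped approach using `System.IO.MemoryMappedFiles` (available since .NET 4.0); the file name is made up and the file is written first so the snippet runs on its own:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Text;

class MmapDemo
{
    static void Main()
    {
        File.WriteAllText("big.txt", "hello mapped world");

        using (var mmf = MemoryMappedFile.CreateFromFile("big.txt", FileMode.Open))
        using (var accessor = mmf.CreateViewAccessor())
        {
            // Read directly from the mapped pages of the file system cache
            // instead of going through a separate read() buffer copy.
            byte[] buffer = new byte[5];
            accessor.ReadArray(0, buffer, 0, buffer.Length);
            Console.WriteLine(Encoding.ASCII.GetString(buffer)); // hello
        }
    }
}
```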

The first rule should be to measure before you optimize. Otherwise you will spend a lot of time and effort in places which are not an issue at all.

Alois Kraus