
I have a large string buffer and an input stream:

basic_string<uint8_t> *buf = ......;
istream &in = ......;

What is the most efficient way to read a part of the file into the string? Say, the 0xE3CC'th to 0x1A481'th bytes from the file.

Here istream::read seems not to be the answer, since it reads into a raw char[]. Because the data is quite large, copying it through a temporary buffer would be inefficient.

And sadly, I don't have C++0x, so copy_n can't be used. What would you suggest? Thanks.

  • *"stream::read seems not an answer since it reads to a raw char[]."* Make the string big enough then pass `&buf[0]` and you're good to go. – jrok Oct 22 '13 at 10:34
  • But is a string in C++ guaranteed to be stored contiguously? –  Oct 22 '13 at 10:38
  • @user2139538: technically - yes, please read here: http://stackoverflow.com/questions/1986966/does-s0-point-to-contiguous-characters-in-a-stdstring – Andriy Tylychko Oct 22 '13 at 10:41
  • Since C++11, it is. Before that, it wasn't official but IIRC, all implementations did it anyway. – jrok Oct 22 '13 at 10:41
  • `since it reads to a raw char[]. Since the data is quite large, having a temporary variable is [in]efficient.` You're supposed to read it in chunks, into a buffer that you don't create and destroy over and over again. This is the efficient approach. – Lightness Races in Orbit Oct 22 '13 at 11:10
  • Actually, slightly more efficient way (depending on how you access the data in the chunk) would be to memory map the chunk of the file as required... – Nim Oct 22 '13 at 12:59
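The chunked-read approach from the comments can be sketched as below: one fixed-size buffer is created once and reused, rather than allocating a temporary the size of the whole range. The function name and chunk size are illustrative, not from the question.

```cpp
#include <fstream>
#include <string>
#include <cassert>

// Read the inclusive byte range [first, last] from `in` into `out`,
// going through a small reusable stack buffer instead of one large
// temporary. Sketch only; error handling is minimal.
void read_range_chunked(std::istream& in, std::streamoff first,
                        std::streamoff last,
                        std::basic_string<unsigned char>& out)
{
    out.clear();
    in.seekg(first);
    char chunk[4096];                                 // reused buffer
    std::streamoff remaining = last - first + 1;      // inclusive range
    while (remaining > 0 && in) {
        std::streamsize want =
            remaining < (std::streamoff)sizeof chunk
                ? (std::streamsize)remaining
                : (std::streamsize)sizeof chunk;
        in.read(chunk, want);
        out.append(reinterpret_cast<unsigned char*>(chunk),
                   (std::size_t)in.gcount());
        remaining -= in.gcount();
    }
}
```

This trades one large allocation for a loop of small reads; whether it beats a single `read` into a pre-sized string depends on how the data is consumed afterwards.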

1 Answer

buf->resize(size);
in.read(reinterpret_cast<char*>(&(*buf)[0]), size);

BTW, do you really need buf to be a pointer?
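Fleshed out with the offsets from the question, the answer's approach might look like the sketch below: seek to the start of the range, size the string, and read straight into its storage. The cast is needed because the string holds `uint8_t` while `read` takes `char*`; the function name is illustrative.

```cpp
#include <fstream>
#include <string>
#include <cassert>

// Read bytes 0xE3CC..0x1A481 (inclusive) of the stream directly into
// the string's own storage, with no intermediate buffer.
void read_range(std::istream& in, std::basic_string<unsigned char>& buf)
{
    const std::streamoff first = 0xE3CC, last = 0x1A481;
    const std::size_t size = (std::size_t)(last - first + 1);
    buf.resize(size);                       // make room first
    in.seekg(first);                        // jump to start of range
    in.read(reinterpret_cast<char*>(&buf[0]), (std::streamsize)size);
    buf.resize((std::size_t)in.gcount());   // shrink on a short read
}
```

Note that this relies on the string's storage being contiguous, which, as the comments discuss, is only guaranteed from C++11 on (though implementations did it in practice before that).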

Andriy Tylychko