
I am going through an implementation of an LZSS decompression algorithm that uses a buffer of 4096 chars (or whatever size you like). The implementation wrote its output to a char*, whereas I want to write the output using std::ofstream. I did get this working, but in a way that seems weird/quirky (at least to me). From what I can tell it has something to do with the assignment operator. All the types involved are the same (char).

If I have

outputFileStream.write((char *) &buffer[byteIndex1++ & 0xFFF], 1);
buffer[byteIndex2++ & 0xFFF] = buffer[byteIndex1 & 0xFFF];

this will fail and give me corrupt data, but if I have this

char temporary;

buffer[byteIndex2++ & 0xFFF] = temporary = buffer[byteIndex1++ & 0xFFF];

outputFileStream.write((char *) &temporary, 1);

that will work. Am I misunderstanding the order of operations that take place (execution of operations right to left)? If my understanding is correct, shouldn't those two code snippets behave the same?
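
For what it's worth, here is a sketch of what I think the copy step is supposed to do, with the byte read exactly once and each increment in its own statement. The copyStep wrapper, the unsigned index type, and the plain char[4096] array are just my own framing to make the example self-contained; buffer, byteIndex1, byteIndex2 and outputFileStream stand for the same variables as above.

#include <fstream>

// Sketch of the copy step with everything explicitly sequenced:
// read the source byte once, store/emit it, then advance the indices.
void copyStep(char (&buffer)[4096],
              unsigned &byteIndex1,
              unsigned &byteIndex2,
              std::ofstream &outputFileStream)
{
    char value = buffer[byteIndex1 & 0xFFF]; // read the source byte once
    buffer[byteIndex2 & 0xFFF] = value;      // copy it into the ring buffer
    outputFileStream.write(&value, 1);       // emit the same byte to the file
    ++byteIndex1;                            // advance the indices afterwards,
    ++byteIndex2;                            // each in its own statement
}

Written this way each copied byte is read from buffer exactly once, so the question of when the increments happen never comes up.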

brad_c6
    possible duplicate of [Could anyone explain these undefined behaviors (i = i++ + ++i , i = i++, etc...)](http://stackoverflow.com/questions/949433/could-anyone-explain-these-undefined-behaviors-i-i-i-i-i-etc). Read [this one](http://stackoverflow.com/questions/4176328/undefined-behavior-and-sequence-points), too. – jrok Jul 28 '13 at 20:23
  • So in my case, is the undefined behavior in the (byteIndex# & 0xFFF) expressions or in the assignments themselves? So I need to play with the order to figure out what the ordering should be. – brad_c6 Jul 28 '13 at 21:16
  • Here is the same code defined properly (I think): `byteIndex1 &= 0xFFF; byteIndex2 &= 0xFFF; outputFileStream.write((char *) &buffer.at(byteIndex1), 1); buffer.at(byteIndex1) = (buffer.at(byteIndex2)); byteIndex1++; byteIndex2++;` – brad_c6 Jul 28 '13 at 21:40
