
I'm trying to write a program that compresses a file, but I get different output every time I run it. While debugging, I found that std::ifstream gives different results in Release and Debug builds. This is my program:

#include <iostream>
#include <fstream>
using namespace std;

int main() {
    ifstream inputfile("testing_file.txt");

    inputfile.seekg(0, ios::end);

    unsigned long filesize = inputfile.tellg();

    inputfile.seekg(0, ios::beg);

    char* buffer = new char[filesize];

    inputfile.read(buffer, filesize);

    for (int i = 0; i < filesize; ++i) {
        cout << (int)*(buffer + i) << endl;
    }

    delete[] buffer;
    inputfile.close();

    cin.get();

    return 0;
}

This is testing_file.txt:

for test !

I get this in Release mode:

102
111
114
32
116
101
115
116
32
33
10
0

but I get this in Debug mode:

102
111
114
32
116
101
115
116
32
33
10
-51

So I want to ask why I get two different outputs from the same program.

sorry for my bad English.

Emc Java
The debug heap in Visual Studio, which I assume you are using, sets memory to certain sentinel values to aid debugging. 0xCD marks newly allocated heap memory. If you zero your buffer before you read, you'll see the same results in both builds. https://stackoverflow.com/questions/370195/when-and-why-will-an-os-initialise-memory-to-0xcd-0xdd-etc-on-malloc-free-new – Retired Ninja Jul 25 '18 at 05:33

2 Answers


For debug builds, the memory pointed to by buffer gets filled with the sentinel value 0xCD; in release builds it does not. Also, why the manual memory management in your code? Use a std::vector instead.

It's also pointless to .close() the file near the end of a block, since the destructor of *fstream will do it for you anyway. RAII is great.

Another thing you should be aware of is that finding a file's size the way you do won't work reliably for files opened in text mode, where a newline on disk could be either '\n' or "\r\n".

inputfile.seekg(0, ios::end);
unsigned long filesize = inputfile.tellg();

will account for the '\r'; inputfile.read(buffer, filesize); will not.

No bullshit code:

#include <iostream>
#include <fstream>
#include <vector>
#include <iterator>
#include <cstdlib>

int main()
{
    char const * const filename{ "testing_file.txt" };
    std::ifstream inputfile{ filename };

    if (!inputfile) {
        std::cerr << "Error opening file \"" << filename << "\"!\n\n";
        return EXIT_FAILURE;
    }

    inputfile >> std::noskipws;
    std::vector<char> buffer{ std::istream_iterator<char>(inputfile), std::istream_iterator<char>() };

    for (auto & ch : buffer)
        std::cout << static_cast<int>(ch) << '\n';
}
Swordfish

It is also worth explaining why that extra byte is printed in the first place. Your program correctly tries to determine the file size, allocate a buffer of that size, read that many bytes, and then print only that many.

Why, then, is an extra byte (0 or -51) printed?

If you take a hex editor to your file, you'll see the hex values:

66 6f 72 20 74 65 73 74 20 21 0d 0a

Note the 0d 0a at the end. When you call seekg(), it seeks to the end, and the result from tellg() is 12. However, the file is open in text mode, and under Windows this means that the 0d 0a pair at the end of the line is reduced to a single 0a, to maintain compatibility with the *nix way of doing things.

So the read() only reads 11 bytes, and when you print a total of 12, the last byte is whatever happened to be in the buffer. @Swordfish's answer explains very well why it's 0 vs -51 depending on the build.

Note that when those hex values are converted to decimal, they exactly match the output of your program, except that the 0d is missing: it would be a 13 between the 33 and the final 10.

If you open the file in binary mode, that extra 13 will appear and the garbage byte will vanish: both builds will then produce the same output.

dgnuff