
I have a program that writes information to text files, then subsequently reads those text files to access the data again. I've compiled the binary and everything seems to work, except that I get a "Debug Error! ... R6010 - abort() is called" error after reading a certain number of the files back into the program.

The relevant part of the read function is below:

// file_name is passed by reference in as a parameter of type const std::string&
std::fstream text_file(file_name, std::ios_base::in | std::ios_base::binary);
CHECK(text_file.is_open()) << file_name;

// height_, width_, and depth_ are class variables of type size_t
char unused_char;
text_file >> width_ >> unused_char >> height_ >> unused_char >> depth_ >>
    unused_char;
std::streampos pos = text_file.tellg();
text_file.close();

// Assertions that width_, height_, and depth_ are positive
CHECK_GT(width_, 0);
CHECK_GT(height_, 0);
CHECK_GT(depth_, 0);

// data_ is a class variable of type std::vector<T>
data_.resize(width_ * height_ * depth_);

// THIS IS WHERE IT DIES
std::fstream binary_file(file_name,
                         std::ios_base::in | std::ios_base::binary);
CHECK(binary_file.is_open()) << file_name;

...

binary_file.close();

I can't seem to figure out what's going on. Everything works fine for the first 83 files, then on file 84 I get this error.

What I've tried:

  • At first I was writing a file, then reading it, and then writing to it again. Thinking it might be a file-locking issue, I gave the second write a different prefix. Didn't fix it.
  • Originally, the first write had been done on a Linux machine (I moved the project). I thought maybe it was an issue with line endings, so I regenerated the files on Windows. Didn't fix it. (Although it's worth noting I didn't have any problems running this on Ubuntu 14.04)
  • I thought perhaps file 84 was the issue, so I had the program skip it and go to file 85. Still threw the error. When I reduce the total number of files to 50, it runs fine. This suggests it has something to do with the number of files and not their contents.
  • I considered that it might be a file handle issue, but all opened streams are closed within this function.
  • I tried debugging in Visual Studio 2013. At the error's breakpoint, the variables were:

    binary_file {_Filebuffer={_Set_eback=0xcccccccc <Error reading characters of string.> _Set_egptr=0xcccccccc <Error reading characters of string.> ...} }    std::basic_fstream<char,std::char_traits<char> >
    data_   { size=0 }  std::vector<float,std::allocator<float> >
    depth_  3   unsigned int
    file_name   "correct\\path\\to\\file84.txt" const std::basic_string<char,std::char_traits<char>,std::allocator<char> > &
    height_ 737 unsigned int
    width_  1395    unsigned int
    

The path to the file is 100% correct, so it's not that the file can't be found. I thought the "error reading characters of string" note was particularly ominous, but I can't work out what it pertains to.

  • Lastly, the stack from the error onwards reads:

    msvcr120d.dll!619114fa()    Unknown
    [Frames below may be incorrect and/or missing, no symbols loaded for msvcr120d.dll] 
    [External Code] 
    my_binary.exe!MyClass<float>::Read(const std::basic_string<char,std::char_traits<char>,std::allocator<char> > & file_name) Line 118 C++
    

These other SO questions (here, here, here, etc.) suggest, however, that this error can be caused by a variety of things.

It shouldn't be a memory or space issue. This ran fine on a Linux machine with only 12 GB of RAM. Now it's on a supercharged Windows machine with over 200 GB of RAM. This computer has multiple TB of free hard drive space. I'm also reading/writing/executing from a directory that I have full permissions to.

I'm running it on Windows 8.1, compiled using Visual Studio 2013. I tried installing CDB and NTSD, but that "Invalid Path to Symbols" issue is a huge stumbling block for me.

Any help you can provide would be extremely appreciated.

marcman
  • Can you surround the code with a `try...catch` and see if you are not catching a thrown exception? – NathanOliver Jun 13 '16 at 14:39
  • 1
    This really needs to be scaled down to a [mcve]. It's not likely that we are able to help with the information we have now (where we only have a small part of the program), other than good guessing. – MicroVirus Jun 13 '16 at 14:43
  • @MicroVirus: I hear you, and believe me I'd like to. Unfortunately I can't. Unless you have the first chunk of the program to produce the files necessary, and then you run it 84 times, and so on... you can't reproduce this. I was hoping that all this info and the function where the error is called would be enough for others who've had this type of problem – marcman Jun 13 '16 at 14:47
  • 1
    Well, the thing is that it seems to be a debug assertion failure that's happening in a part of `fstream`, but that's not very specific. The fact that it only happens after 83 or so files means it's not going to be as obvious to pinpoint by just looking at the function that works 'correctly' 83 times, unless something 'obvious' like stack corruption can be found. Is the memory usage of your program very high? Unless you're compiling for 64 bits, you only have ~4 GB at your disposal. – MicroVirus Jun 13 '16 at 14:52
  • One idea: what happens if you run it over the first 50 files twice, such that you reach 100 files processed with only actually the (contents/names of) first 50 files? – MicroVirus Jun 13 '16 at 14:56
  • @MicroVirus: It still errors out in that case. Can you go further into what you mean by "Unless you're compiling for 64 bits, you only have ~4 GB at your disposal"? Because I compiled this as 32-bit... – marcman Jun 13 '16 at 15:06
  • @marcman 32 bits programs have 32 bits address space, even if they are run under a 64 bits OS. I think the default for Visual Studio is to build for 32 bits. That means that your program, unless you compile it to target x64, can use only a tiny amount of the actual memory available, namely 4 GB in total, with all loaded DLLs and OS stuff also eating away at that space. I don't know the figures, but expect to cap at around 3-3.5 GB of memory for 32 bits programs run in 64 bits Windows. If you want to make use of the greater memory, you'll need to compile for x64. – MicroVirus Jun 13 '16 at 15:09
  • @MicroVirus: That very well could be the issue. That makes a ton of sense too. I will try to recompile to target x64, and hopefully that'll solve the issue. Thank you – marcman Jun 13 '16 at 15:12
  • @MicroVirus: To follow up, you were exactly right. A 64-bit target fixed the problem – marcman Jun 17 '16 at 18:23
  • @marcman Good to hear you fixed it :) – MicroVirus Jun 17 '16 at 18:49

0 Answers