
I am currently coding a search algorithm to find all occurrences of a specific byte (char) array in the metadata of a video file. I am trying to load the entire contents of a very large file (about 2 GB) into an array, but I keep getting a bad_alloc exception when I run the program because of the size of the file.

I think one solution would be to create a buffer in order to "chunk" the contents of the file, but I am not sure how to go about coding this.

So far, my code looks like this:

string file = "metadata.ts";
ifstream fl(file);  
fl.seekg(0, ios::end);  
size_t len = fl.tellg(); 
char *byteArray = new char[len];
fl.seekg(0, ios::beg); 
fl.read(byteArray, len); 
fl.close();

It works for smaller video files, but when I try a file that's slightly under 2 GB, the program terminates with a bad_alloc exception.

Thanks in advance for any help - I'm open to all solutions.

EDIT: I have already checked out the other solutions on Stack Overflow, and they are not exactly what I'm looking for. I am trying to "chunk" the data and use a buffer to put it into an array, which is not what the other solutions do.
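To make it clearer what I mean by "chunking", this is roughly the kind of loop I have in mind: read the file a few MB at a time into a reusable buffer and search each piece, carrying over the last few bytes of the previous piece so a match that straddles two chunks isn't missed. The chunk size, the placeholder pattern bytes, and the carry-over idea are all my own guesses, so I'm not sure this is the right approach:

#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main() {
    const std::string file = "metadata.ts";
    // placeholder pattern - the real byte array is longer; constructed with an
    // explicit length so embedded zero bytes would survive
    const std::string pattern("\x47\x40", 2);
    const std::size_t chunkSize = 4 * 1024 * 1024;   // 4 MB per read - just a guess

    std::ifstream fl(file, std::ios::binary);
    if (!fl) {
        std::cerr << "could not open " << file << '\n';
        return 1;
    }

    std::vector<char> buffer;
    std::string carry;            // tail of the previous chunk, kept for boundary matches
    std::size_t chunkStart = 0;   // absolute file offset of the current chunk

    while (fl) {
        buffer.resize(chunkSize);
        fl.read(buffer.data(), static_cast<std::streamsize>(buffer.size()));
        const std::size_t got = static_cast<std::size_t>(fl.gcount());
        if (got == 0)
            break;

        // search the carried-over tail plus the fresh bytes so a match that
        // straddles two chunks is still found
        std::string window = carry;
        window.append(buffer.data(), got);

        for (std::size_t pos = window.find(pattern); pos != std::string::npos;
             pos = window.find(pattern, pos + 1)) {
            std::cout << "match at byte " << (chunkStart - carry.size() + pos) << '\n';
        }

        // keep the last pattern.size() - 1 bytes; a full match can never sit
        // entirely inside them, so nothing gets reported twice
        if (window.size() >= pattern.size())
            carry.assign(window.end() - (pattern.size() - 1), window.end());
        else
            carry = window;

        chunkStart += got;
    }
}

If this is roughly the right idea I can tune the chunk size later; I mainly want to confirm that the carry-over trick is how matches across chunk boundaries are normally handled.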

Alex
  • Try it: https://stackoverflow.com/questions/34751873/how-to-read-huge-file-in-c – Andre Conjo Jun 21 '18 at 12:09
  • Why are you searching for it? I would write something like `std::find(std::istream_iterator<char>(filestream), std::istream_iterator<char>{}, wanted_byte);` – Incomputable Jun 21 '18 at 12:11
  • I don't think you need C++ for that at all: the `grep` command has one of the best search algorithms available, **without** reading the whole file (!): https://stackoverflow.com/questions/12629749/how-does-grep-run-so-fast – Michał Łoś Jun 21 '18 at 12:11
  • Possible duplicate of [How to read huge file in c++](https://stackoverflow.com/questions/34751873/how-to-read-huge-file-in-c) – rustyx Jun 21 '18 at 12:12
  • Search the internet for "c++ memory mapped I/O". – Thomas Matthews Jun 21 '18 at 14:11

0 Answers