I have to write a program that reads a huge file with operator>>, counts things, and then inserts everything into an unordered_multimap.
It works perfectly with small files, but with big files the program runs out of memory: it throws a bad_alloc on 32-bit Windows.
I made a 64-bit build on Windows (to get around the address-space limit for 32-bit programs), opened Task Manager, and what I see is that the memory usage keeps growing continuously; when the RAM is full, the program crashes. Do you know how to deal with this?
Here is the code:
#include <iostream>
#include <string>
#include <fstream>
#include <unordered_map>

using namespace std;

int main()
{
    ifstream fin("C:/text.txt");
    string const outfile("C:/out/out.txt");
    ofstream fout(outfile.c_str());

    string word;
    unordered_multimap<string, int> mymm;
    int intnum = 0;

    // Read the file word by word and store every occurrence
    // together with its running index in the file.
    while (fin >> word)
    {
        mymm.insert(pair<string, int>(word, intnum));
        ++intnum;
    }

    return 0;
}
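For context, here is a minimal sketch of an alternative I am considering, assuming it would be acceptable to keep only a count per distinct word instead of one entry per occurrence (which is what the unordered_multimap above stores):

#include <iostream>
#include <fstream>
#include <string>
#include <unordered_map>

using namespace std;

int main()
{
    ifstream fin("C:/text.txt");

    // One entry per distinct word rather than one per occurrence,
    // so memory grows with the vocabulary size, not the file size.
    unordered_map<string, size_t> counts;

    string word;
    while (fin >> word)
        ++counts[word];

    for (auto const& entry : counts)
        cout << entry.first << ' ' << entry.second << '\n';

    return 0;
}

With this variant memory should grow with the number of distinct words rather than the total number of words, but I lose the per-occurrence index that the multimap gives me, so I am not sure it fits my requirement.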