I have written a program for external sorting following the book Programming Pearls. Its biggest array is `char all_nums[10000000];`, which needs about 10 MB of stack memory (not really big). But the program doesn't run well: I compiled it with clang 3.5 and gcc 4.8 and ran it on Ubuntu 14.04, and it gets a `segmentation fault (core dumped)` error. When I decrease the array size to `char all_nums[1000000];`, it runs fine.
The whole code is under here https://gist.github.com/xuefu/9aecc7f2b8ae3ab0ce55.
- Can the OS limit a running process's memory? Are the limits the same for stack memory and heap memory?
- How can I get around this memory limit?
- How can I store bits in C, like the `bitset` class in C++?
The main sorting code is the function `disk_sort()`:
```c
void disk_sort()
{
    char all_nums[MAX_SCOPE];
    char buf[MAX_BUF];
    char *ch;
    FILE *fp;
    int n, j;

    fp = fopen(FILE_NAME, "r");
    for (n = 0; n < MAX_SCOPE - 1; ++n)
    {
        all_nums[n] = '0';
    }
    all_nums[MAX_SCOPE - 1] = '\0';
    while (fgets(buf, MAX_BUF, fp) != NULL)
    {
        sscanf(buf, "%d\n", &n);
        all_nums[n]++;
    }
    fclose(fp);

    fp = fopen(FILE_RESULT, "a");
    n = 0;
    while (all_nums[n] != '\0')
    {
        if (all_nums[n] != '0')
        {
            ch = itostr(n, &j);
            ch[j++] = '\n';
            ch[j] = '\0';
            for (int i = 0; i < all_nums[n] - '0'; ++i)
            {
                fwrite(ch, sizeof(char), j, fp);
            }
            free(ch);
            ch = NULL;
        }
        ++n;
    }
    fclose(fp);
}
```