So I have a text file containing about 666 000 lines, each with at most 10 numbers separated by spaces. Example:
8 38 62 39 4 50 86 43
53 78 38 22 39 29 78 5
24 13 58 92
.......
53 78 38 22 39 29 78 5
Given a sequence of n numbers, I have to check whether all of a line's elements appear in that sequence.
I have tried something like this:
/* f (FILE *), line (char[256]), seq[] and n are globals declared elsewhere. */
int check()
{
    int nrelem, line_int[11];
    int found_counter, sw;
    char *p;

    f = fopen("temp.txt", "rt");
    /* Using fgets() as the loop condition instead of !feof(f) avoids
       processing the last line twice. */
    while (fgets(line, 256, f) != NULL)
    {
        nrelem = 0;
        found_counter = 0;

        /* Split the line on spaces and convert each token to an int. */
        p = strtok(line, " ");
        while (p != NULL)
        {
            sscanf(p, "%d", &line_int[nrelem++]);
            p = strtok(NULL, " ");
        }

        /* Count how many elements of seq appear on this line. */
        for (int i = 0; i < n; i++)
        {
            sw = 0;    /* reset for every element of seq, not once per line */
            for (int j = 0; j < nrelem; j++)
            {
                if (seq[i] == line_int[j])
                {
                    sw = 1;
                    break;
                }
            }
            if (sw)
                found_counter++;
        }

        /* As many hits as elements on the line: treat it as a match. */
        if (found_counter == nrelem)
        {
            fclose(f);    /* don't leak the handle on the early return */
            return 0;
        }
    }
    fclose(f);
    return 1;
}
The problem is that the running time of this function on a 600 000-line file is about 14 seconds. I suspect the culprit is the way I extract the numbers from each line with strtok and the file handling in general. Do you know a better approach that could bring the running time below one second, without needing a quantum computer? :D Thank you in advance.
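One idea I've been toying with is to build a lookup table over the value range so the membership test becomes a single array access, and to parse each line with strtol instead of strtok + sscanf. This is only a rough, unmeasured sketch of that idea: check_fast, in_seq and MAX_VALUE are names I made up here, MAX_VALUE is a guessed upper bound based on the sample data, and it tests the condition as I stated it above (every number on the line is in the sequence).

#include <stdio.h>
#include <stdlib.h>

#define MAX_VALUE 1000                 /* assumed upper bound on any number in the file */

static char in_seq[MAX_VALUE + 1];     /* in_seq[v] == 1 if v occurs in seq */

/* Same convention as check(): returns 0 as soon as some line has every
   element in seq, 1 if no such line exists. */
int check_fast(const int *seq, int n)
{
    for (int v = 0; v <= MAX_VALUE; v++)
        in_seq[v] = 0;
    for (int i = 0; i < n; i++)
        if (seq[i] >= 0 && seq[i] <= MAX_VALUE)
            in_seq[seq[i]] = 1;

    FILE *f = fopen("temp.txt", "rt");
    if (f == NULL)
        return 1;

    char buf[256];
    while (fgets(buf, sizeof buf, f) != NULL)
    {
        char *p = buf;
        int any = 0, all_found = 1;

        while (*p != '\0')
        {
            char *end;
            long v = strtol(p, &end, 10);   /* parse the next number in place */
            if (end == p)                   /* no more digits on this line */
                break;
            any = 1;
            if (v < 0 || v > MAX_VALUE || !in_seq[v])
            {
                all_found = 0;              /* this element is not in seq */
                break;
            }
            p = end;
        }

        if (any && all_found)
        {
            fclose(f);
            return 0;
        }
    }
    fclose(f);
    return 1;
}

The table replaces the inner loop over seq with a constant-time lookup, and strtol walks each line once instead of tokenizing it and re-scanning every token with sscanf. I have no idea yet whether that alone is enough to get under a second, so other suggestions are welcome.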