
At the moment I am using `char *lines[1000+1]` to hold a file's lines, and I was wondering whether there is a more efficient way, so that I can read files as small as 1 line or as big as 5000 lines without having to hard-code the array size.

Here is the code I am using:

int tokenize(char *result[], char *data, char *delimiter) {
    int i = 0;
    char *token = strtok(data, delimiter);

    while(token != NULL) {
        result[i++] = token;

        token = strtok(NULL, delimiter);
    }

    return i;
}

char *filebuffer = NULL;
char *lines[1000+1];

/* Read the file */
filebuffer = readfile(argv[1]);
if(filebuffer == NULL) {
    printf("Failed to read file.\n");
    return EXIT_FAILURE;
}

/* Split file into lines */
linecount = tokenize(lines, filebuffer, "\n");
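One direction (along the lines of the dynamically growing array the comments below suggest) is to let `tokenize` allocate its own result array and grow it with `realloc` as tokens arrive. This is only a sketch; `tokenize_dyn` is a hypothetical name, and the starting capacity of 16 is arbitrary:

```c
#include <stdlib.h>
#include <string.h>

/* Like tokenize() above, but grows the result array instead of
 * writing into a fixed-size one. The tokens point into data, so only
 * the returned array itself needs to be freed by the caller.
 * Returns NULL on allocation failure. */
char **tokenize_dyn(char *data, const char *delimiter, int *count) {
    size_t capacity = 16;              /* arbitrary starting size */
    char **result = malloc(capacity * sizeof *result);
    if (result == NULL)
        return NULL;

    int i = 0;
    char *token = strtok(data, delimiter);
    while (token != NULL) {
        if ((size_t)i == capacity) {   /* array full: double it */
            capacity *= 2;
            char **tmp = realloc(result, capacity * sizeof *result);
            if (tmp == NULL) {
                free(result);
                return NULL;
            }
            result = tmp;
        }
        result[i++] = token;
        token = strtok(NULL, delimiter);
    }

    *count = i;
    return result;
}
```

The doubling strategy keeps the number of `realloc` calls logarithmic in the line count, so a 1-line file and a 5000-line file both work without any hard-coded limit.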
Johnny Doey
  • Would using linked lists work in this case? I don't know if that would be smart... – Johnny Doey Jan 24 '15 at 12:06
  • Read into a small buffer using [`fgets`](http://en.cppreference.com/w/c/io/fgets), and as long as the last character in the buffer is not a newline, continue reading while dynamically [reallocating](http://en.cppreference.com/w/c/memory/realloc) the buffer. – Some programmer dude Jan 24 '15 at 12:06
  • possible duplicate of [C dynamically growing array](http://stackoverflow.com/questions/3536153/c-dynamically-growing-array). You might especially look at [this answer](http://stackoverflow.com/a/3536261/), which features a 20-line basic implementation of a dynamically growing array. – FabienAndre Jan 24 '15 at 12:09
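The `fgets`/`realloc` suggestion above could be sketched like this (`read_line` is a hypothetical helper name, and the 128-byte starting size is an arbitrary choice):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Read one line of arbitrary length from fp: read into the buffer with
 * fgets, and as long as the last character read is not a newline, grow
 * the buffer with realloc and keep reading.
 * Returns a malloc'd string the caller must free, or NULL on EOF/error. */
char *read_line(FILE *fp) {
    size_t size = 128;                 /* arbitrary starting size */
    size_t len = 0;
    char *buf = malloc(size);
    if (buf == NULL)
        return NULL;

    while (fgets(buf + len, (int)(size - len), fp) != NULL) {
        len += strlen(buf + len);
        if (len > 0 && buf[len - 1] == '\n')
            return buf;                /* complete line read */
        size *= 2;                     /* no newline yet: grow and continue */
        char *tmp = realloc(buf, size);
        if (tmp == NULL) {
            free(buf);
            return NULL;
        }
        buf = tmp;
    }

    if (len == 0) {                    /* nothing read: EOF or error */
        free(buf);
        return NULL;
    }
    return buf;                        /* last line had no trailing newline */
}
```

Calling this in a loop until it returns NULL reads every line of the file, whether the file has 1 line or 5000, with no fixed limit on either line length or line count.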

0 Answers