I'm guessing I'm missing something fairly simple here. I'm trying to read a file line by line, tokenizing each buffer as I go; the basics of what I'm trying to do are pasted below. I've never had issues with strtok before, so I suspect the problem has to do with the buffer I'm using. Any nudges in the right direction? I've read that strtok isn't a great option, but it's the only thing I'm familiar with (I suppose I could write my own function). The first token of each line is read correctly every time; the seg fault only happens when I try to get the second token with strtok(NULL, " ").
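For what it's worth, calling strtok directly on a string has always worked for me. Here's a minimal standalone example of the pattern I'm used to (the input string is just made up for illustration), and it runs fine:

#include <stdio.h>
#include <string.h>

int main(void)
{
    // made-up input, just to show the pattern I'm familiar with
    char line[] = "one two three";

    char *token = strtok(line, " ");
    while (token != NULL)
    {
        puts(token);              // prints each token on its own line
        token = strtok(NULL, " ");
    }
    return 0;
}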
I don't know why this was downvoted as a duplicate. Yes, there are answers out there that cover the basics of what I'm trying to do, but I want to understand the problem, not just cut and paste. I'd prefer to know WHY there is a seg fault and why my code behaves the way it does. No need to downvote when I'm asking specific questions that aren't addressed directly in other posts.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

//Eventually file will be a command line opt
const char *file = "path/to/file/file.txt";

void tokenize();
FILE *open_file(const char *file);

int main(int argc, char *argv[])
{
    tokenize();
    return 0;
}
void tokenize()
{
    FILE *fp = open_file(file);
    char buffer[BUFSIZ];

    // read the file one line at a time
    while(fgets(buffer, BUFSIZ, fp) != NULL)
    {
        //puts("========================================");
        //puts(buffer);
        //puts("========================================");

        // tokenize a copy of the line so the original buffer is untouched
        char *data = strdup(buffer);
        char *token;

        token = strtok(data, " ");
        //puts(token);
        while(token != NULL)
        {
            token = strtok(NULL, " ");
            puts("++++++++++++++++++++++++++++++++++++++++++++++");
            puts(token);
            puts("++++++++++++++++++++++++++++++++++++++++++++++");
        }
        free(data);
    }
    fclose(fp);
}
FILE *open_file(const char *file)
{
    FILE *fp = fopen(file, "r");
    if(fp == NULL)
    {
        perror("Error opening file");
        exit(EXIT_FAILURE);   // bail out so the caller never reads a NULL stream
    }
    return fp;
}