
I was playing around a little and noticed that the length of my input was longer than the number of bytes allocated for my char* buffer. I then chose to allocate even less memory and eventually set the allocation size to 0.

I have not tried it, but I would assume the same holds for the other stdio input functions.

#include <stdio.h>
#include <stdlib.h>

int main(){
    FILE *file = fopen("../input.txt", "r");
    if(file == NULL){
        printf("Error reading file");
        return 0;
    }

    /* deliberately undersized: zero bytes for two ints and a string */
    int *bounds = (int*) malloc(0);
    char *token = (char*) malloc(0);

    /* fscanf writes through these pointers anyway, without complaint */
    while(fscanf(file, "%i-%i %[^\n]", bounds, bounds + 1, token) != EOF){
        printf("%i-%i %s\n", *bounds, *(bounds + 1), token);
    }

    fclose(file);
}

Is there any specific reason for this behavior, and in particular why it does not yield an error? It would be lovely to get some info on this. Thank you in advance!
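
For contrast, here is a minimal fixed sketch (the buffer size of 64 and the field width of 63 are assumptions for illustration): allocate real storage, bound the %[ conversion with a width, and compare fscanf's return value against the number of expected conversions rather than EOF, since a matching failure returns a short count instead of EOF.

#include <stdio.h>

int main(){
    FILE *file = fopen("../input.txt", "r");
    if(file == NULL){
        printf("Error reading file");
        return 0;
    }

    int bounds[2];      /* real storage for the two integers */
    char token[64];     /* real storage for the string       */

    /* %63[^\n] caps the write at 63 characters plus the '\0' */
    while(fscanf(file, "%i-%i %63[^\n]", &bounds[0], &bounds[1], token) == 3){
        printf("%i-%i %s\n", bounds[0], bounds[1], token);
    }

    fclose(file);
}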

YayL
  • This is [undefined behavior](https://stackoverflow.com/questions/2397984/undefined-unspecified-and-implementation-defined-behavior). The C standard does not require implementations to test this sort of thing or error out in any particular way. They may crash sometimes, or corrupt data other times, or appear to work just fine at still other times. – Nate Eldredge Dec 06 '21 at 22:15
  • In particular, do not think of segfault as the magical guaranteed result of every erroneous memory operation. It occurs when certain specific things happen at the machine level - usually when you access a *page* of memory that is not mapped for your process. If you access within a page that is mapped, no segfault, even if it wasn't memory that you wanted to be accessing. – Nate Eldredge Dec 06 '21 at 22:16
  • YayL, "fscanf pointer variable not giving sigsegv" --> are you expecting that code is _specified_ to give sigsegv? – chux - Reinstate Monica Dec 06 '21 at 22:35
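
To make the page-level point concrete, here is a minimal sketch (still undefined behavior; it only illustrates the typical outcome on a glibc-style allocator): a small overrun of a malloc(0) block usually lands in an already-mapped heap page, which the OS cannot distinguish from a legitimate access, so no SIGSEGV is raised.

#include <stdlib.h>
#include <string.h>

int main(){
    /* malloc(0) may legally return NULL or a unique non-NULL
       pointer; glibc hands back a pointer into a mapped heap page */
    char *p = malloc(0);

    if(p != NULL){
        /* undefined behavior: writes 6 bytes past a 0-byte block,
           but they typically land in the same mapped page, so the
           OS sees a valid access and delivers no SIGSEGV */
        strcpy(p, "hello");
    }

    /* by contrast, touching an unmapped page does segfault:
       *(volatile char *)0 = 'x';   (left commented out on purpose) */

    free(p);
}

Tools such as AddressSanitizer (compile with -fsanitize=address in gcc or clang) or Valgrind will report exactly this kind of overrun even when it never crashes.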

0 Answers