I found this implementation of `char *strchr(const char *string, int c);`:

for (;;) 
  if (*string == c)
    return (char *) string;
  else if (*string == '\0')
    return NULL;
  else
    string++;

To me, though, the following seems equivalent, and much easier to read:

while (*string != c && *string != '\0')
  string++;

return (*string == c) ? ((char *) string) : (NULL);

I take it there is some reason why libc implements it the first way. Does anyone know what that reason is?

Eduardo Bezerra
  • As a long shot, it could be a way to make the code "easier" to optimize by localizing it a bit more, reusing the same expression's result (`*string`) closer to its last use. – unwind Apr 10 '13 at 09:28
  • If you really want to know, put the two versions in separate functions in a test project and compare the generated assembly at different optimization levels. It might turn out their version optimizes better? – Some programmer dude Apr 10 '13 at 09:29

1 Answer


Yes: Visual Studio doesn't like `while (1)` when you bump the warning level up to the maximum (especially if you ask it to treat warnings as errors), but it is fine with `for (;;)`.

Alexey Frunze