I hope this is not a duplicate; if it is, I apologize.
Edit: this is a duplicate, in the sense that a similar question has been asked before. The answers to that question don't quite fit. To get a better sense of the subtlety I am trying to get at, read my comment about uninitialized memory. End of edit.
My question is about whether the C code below is, in some sense, legal. Even if it is, I wouldn't use it, since it generates a warning, and warnings annoy me. Still, I'm curious. But enough of the rambling introduction; the question itself is rambling enough as it is. The setup is as follows.
Suppose you want to sort an array of integers. Instead of implementing your own sort, you decide to use qsort, which expects a comparison function taking two const void* arguments. Inside the function, you cast the const void* to const int*, and then compare the pointed-to values.
In an ideal world, I'd like to pass a function that takes two int* instead. The compiler warns if you try this, for obvious reasons: for all it knows, qsort will try to call your function with a short* and a double*, and havoc will ensue. If the C type system were more expressive, then the declaration of qsort could give the compiler enough information to deduce that this will never happen.
But even with the type system we have, a Sufficiently Smart Compiler (tm) knows exactly how qsort works, and thus it knows that your comparison function will always get pointers to integers. Unfortunately, even then, the compiler must warn, because if the compiler is truly portable, it may have to support platforms where the size of a void* is not the same as the size of an int*.
But such platforms are very rare, bordering on non-existent. Far more common are platforms where an int is only 2 bytes, and on those platforms the expression 123*456 is undefined behavior. Most platforms today have a larger int, though, and on those the expression is well defined. If a compiler with 4-byte ints were to assume that 123*456 is unreachable, then that compiler would not be standard-compliant. In other words, once the implementation makes some promise (a large int), the standard forbids it from doing things that are in general allowed (treating some expressions as if they overflowed).
Let's take another example. In most operating systems, dereferencing a null pointer will crash your program. But even if you don't mind crashing, you must still check the return value of malloc, because dereferencing null is undefined behavior according to the standard, and very few compilers make stronger guarantees in this case.
My question is thus: where does the function-pointer cast fall on this continuum? I'm pretty sure that most platforms in common use today do in fact guarantee that all pointers have the same representation, are passed using the same calling convention, and so on. If I only care about those platforms, can I pass a comparison function that takes int* instead of void*, just as I can fearlessly multiply 3-digit numbers? Or is it more like dereferencing null, where the behavior is undefined unless the documentation explicitly says otherwise?