There is no such thing as an "implicit typecast": a typecast always refers to explicitly changing the type of an operand. The proper term is implicit (type) conversion.
The fundamentals of C (the "as-if rule") state that the compiler is free to reorder or optimize your program as it pleases, as long as doing so doesn't change the observable outcome of the program.
So even if the compiler spots that a division by 1 is pointless and could be optimized away, it must still take the potential effects of the implicit type conversion into account: it cannot optimize those away, because the programmer might have been intentionally relying on them.
In your specific case, `signed int div = (val/count);` forces `val` to be implicitly converted to unsigned type. But it doesn't really matter here: you store the result back into a signed type, and anything divided by 1 remains unchanged anyhow. The compiler can therefore optimize the whole thing away, because the result would have been the same no matter whether unsigned or signed arithmetic was used.
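A minimal sketch of that case, assuming 32-bit `int`, a two's complement system, `val` holding `-20` and `count` holding `1` (the exact declarations are assumptions based on the description above):

```c
#include <stdio.h>

int main(void)
{
    int val = -20;           /* signed operand */
    unsigned int count = 1;  /* unsigned operand */

    /* Usual arithmetic conversions: val is converted to unsigned int,
       so with 32-bit int the division is 0xFFFFFFEC / 1.
       Storing the out-of-range result back into a signed int is
       implementation-defined, but on common two's complement systems
       it yields -20 again, so signed vs. unsigned arithmetic makes no
       difference and the compiler may drop the division entirely. */
    signed int div = (val / count);

    printf("%d\n", div);     /* prints -20 on such systems */
    return 0;
}
```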
If you divide by 5 though, the results turn out very different: signed `-20/5` gives `-4`, while unsigned `0xFFFFFFEC/5` gives `0x3333332F` (858993455). So then the compiler is not allowed to optimize away the implicit conversion, because it affects the result.
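A sketch of the divide-by-5 case under the same assumptions (32-bit `int`, two's complement); the cast to `int` is only there to show the signed result side by side:

```c
#include <stdio.h>

int main(void)
{
    int val = -20;
    unsigned int count = 5;

    /* Unsigned arithmetic: val becomes 0xFFFFFFEC (4294967276),
       and 4294967276 / 5 = 858993455 = 0x3333332F. */
    signed int udiv = val / count;

    /* Signed arithmetic for comparison: -20 / 5 = -4. */
    signed int sdiv = val / (int)count;

    printf("unsigned division: %d (0x%X)\n", udiv, (unsigned int)udiv);
    printf("signed division:   %d (0x%X)\n", sdiv, (unsigned int)sdiv);
    return 0;
}
```

Here the two computations print 858993455 and -4 respectively, so the implicit conversion visibly changes the result and cannot be removed under the as-if rule.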
Therefore the programmer has to know the implicit type conversion rules to tell what will actually happen between the lines of their own source code.