The prototype of strlen() is:

size_t strlen(const char *s);

Its return type is size_t, which is an unsigned integer type (in most implementations, unsigned int or unsigned long).
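If you're curious, you can check its width on your platform (a minimal sketch; the exact size is implementation-defined):

#include <stdio.h>

int main(void)
{
    printf("%zu\n", sizeof(size_t));   /* typically 4 on 32-bit and 8 on 64-bit systems */
    return 0;
}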
When you subtract two unsigned integers, the result can never be negative: if the mathematical result would fall below 0, the smallest unsigned value, it wraps around modulo 2^N. Therefore on a typical 32-bit system, 3U - 5U == 4294967294U, and on a typical 64-bit system, 3UL - 5UL == 18446744073709551614UL.
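You can see the wrap-around directly (a minimal demo, assuming the common model where unsigned int is 32 bits and unsigned long is 64 bits):

#include <stdio.h>

int main(void)
{
    unsigned int  a = 3U  - 5U;    /* wraps to UINT_MAX  - 1 */
    unsigned long b = 3UL - 5UL;   /* wraps to ULONG_MAX - 1 */
    printf("%u\n", a);             /* 4294967294 */
    printf("%lu\n", b);            /* 18446744073709551614 */
    return 0;
}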
Your test of (strlen(s) - strlen(t)) > i therefore behaves exactly like strlen(s) != strlen(t) when i == 0: the lengths being identical, which makes the difference 0, is the only case that renders the test false.
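To illustrate, here is a hypothetical reconstruction of such a check (the function name longer_by is my invention, not your code):

#include <string.h>

/* Intended to test whether s is more than i characters longer than t.
 * Broken: when t is longer than s, the difference wraps to a huge
 * unsigned value, so the test is true even though s is shorter. */
int longer_by(const char *s, const char *t, size_t i)
{
    return (strlen(s) - strlen(t)) > i;
}

With this definition, longer_by("abc", "defgh", 0) returns 1 even though "abc" is the shorter string, because 3 - 5 wraps around.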
It's advisable to avoid subtraction when comparing unsigned integers. If you really need such a comparison, addition is better:
strlen(s) > strlen(t) + i
This way unsigned integer wrap-around is far less likely: strlen(t) + i can only wrap if the sum exceeds SIZE_MAX.
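Applied to the sketch above (again hypothetical):

#include <string.h>

/* Safe version: the addition wraps only if strlen(t) + i exceeds
 * SIZE_MAX, which cannot happen for realistic string lengths. */
int longer_by(const char *s, const char *t, size_t i)
{
    return strlen(s) > strlen(t) + i;
}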
By the way, if you save the lengths of the strings in variables, you avoid calling strlen() more than once per string. And since you do not modify the strings in your function, it is better to declare the function parameters as const char *.
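Putting both tips together (still a sketch under the same assumptions):

#include <string.h>

/* Cache the lengths so each string is scanned only once, and take
 * const char * since the strings are never modified. */
int longer_by(const char *s, const char *t, size_t i)
{
    size_t ls = strlen(s);
    size_t lt = strlen(t);
    return ls > lt + i;
}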
It's also recommended that you write

const char *x = "abc";
const char *y = "defgh";
since string literals cannot be modified. Any attempt to modify a string literal invokes undefined behavior.
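For example, without const the compiler will accept the following, but the write is still undefined:

char *p = "abc";   /* allowed in C, but the literal is effectively read-only */
p[0] = 'X';        /* undefined behavior: modifying a string literal */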