If I execute `parseInt('111AAA')` in JavaScript, I get the output `111` (the number, not a string). Why does `parseInt` work this way? The MDN docs say:
> The parseInt function converts its first argument to a string, parses it, and returns an integer or NaN. If not NaN, the returned value will be the decimal integer representation of the first argument taken as a number in the specified radix (base). For example, a radix of 10 indicates to convert from a decimal number, 8 octal, 16 hexadecimal, and so on. For radices above 10, the letters of the alphabet indicate numerals greater than 9. For example, for hexadecimal numbers (base 16), A through F are used.
>
> If parseInt encounters a character that is not a numeral in the specified radix, it ignores it and all succeeding characters and returns the integer value parsed up to that point. parseInt truncates numbers to integer values. Leading and trailing spaces are allowed.
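To make that behaviour concrete, here is what I see in a console:

```js
parseInt('111AAA');     // 111     -> parsing stops at 'A', which is not a base-10 digit
parseInt('111AAA', 16); // 1120938 -> in base 16, 'A' IS a valid digit, so the whole string parses (0x111AAA)
parseInt('AAA');        // NaN     -> no leading digits at all, nothing to parse
parseInt('  111  ');    // 111     -> leading and trailing spaces are allowed
parseInt(15.99);        // 15      -> the number is first converted to the string "15.99", then parsed up to '.'
```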
I've started to pick up JS and have a background in Java and C. Shouldn't it throw an error if an alphanumeric value is supplied to it? What is the reasoning behind this behaviour? Is there any other language that adopts this?
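For comparison, JavaScript's `Number()` conversion is all-or-nothing, which is closer to what I would have expected coming from Java:

```js
Number('111AAA'); // NaN -> the whole string must be numeric, any invalid character rejects it
Number('111');    // 111
Number(' 111 ');  // 111 -> surrounding whitespace is still allowed
```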
Edit: It's been pointed out by @jabaa that similar functionality exists in C++'s `std::stol`: https://en.cppreference.com/w/cpp/string/basic_string/stol