I have a web application that users use to send messages.
The problem is that the number of characters in a message determines the cost of sending it.
I have noticed that the JavaScript UI code counts the characters just fine, but the DBMS's built-in functions sometimes return a higher number of characters.
Here is an example of a string that exhibits this anomalous behaviour:
String with different lengths..
This string has different lengths depending on the programming language used to count the characters.
Transact-SQL LEN() and MySQL LENGTH() return 217.
Python len() returns 212.
The standard string length functions in JavaScript and Python return similar values, but lower than the values returned by Transact-SQL's LEN() and DATALENGTH() and MySQL's LENGTH() (which in turn return values similar to each other).
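In case it helps, here is a minimal Python sketch of the kind of discrepancy I am seeing. The sample string is hypothetical (not the exact message above), and I am assuming the text contains characters outside ASCII and that it is stored as UTF-8 at some point:

```python
# -*- coding: utf-8 -*-
# Hypothetical sample text containing curly quotes and an en dash
# (NOT the exact message from above).
sample = "It\u2019s a test \u2013 with curly quotes \u201clike these\u201d"

# Python's len() counts Unicode code points.
code_points = len(sample)

# Counting the UTF-8 encoded bytes mimics what a byte-oriented function
# (e.g. MySQL LENGTH() or T-SQL DATALENGTH() on a UTF-8 stored value) would see.
utf8_bytes = len(sample.encode("utf-8"))

print(code_points)  # 44 code points
print(utf8_bytes)   # 52 bytes: each non-ASCII character above takes 3 bytes in UTF-8
```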
So why the different values?