I'm messing around with the source code of an old Java game from the late nineties. If I remember correctly, it was written for JDK 1.1.
Somewhere in the code, int primitives (in the range of 0 to about 120) are converted to chars. Here's an example:
char c = (char)(i + 32); // i = 0 maps to ' ' (32), i = 94 maps to '~' (126)
This causes a problem for ints of 95 and above. Here's the code and some of the output from a test case:
for (int i = 120; i >= 0; i--) {
    System.out.println(i + " -> " + (char)(i + 32));
}
Output:
...
100 -> ?
99 -> ?
98 -> ?
97 -> ?
96 -> ?
95 ->
94 -> ~
93 -> }
92 -> |
91 -> {
90 -> z
89 -> y
88 -> x
87 -> w
...
3 -> #
2 -> "
1 -> !
0 ->
The integer value seems to get lost once i + 32 goes past 126, the end of the printable ASCII range: 95 maps to 127 (DEL, which prints as nothing), and 96 and up map beyond ASCII entirely.
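Casting straight back does recover the number, though, so the cast itself is lossless; the '?' presumably only appears when the char is rendered for output:

for (int i = 96; i <= 120; i++) {
    char c = (char)(i + 32);
    // (int) c prints the numeric code unit (128..152) intact,
    // suggesting the '?' above is introduced when the char is encoded for the console.
    System.out.println(i + " -> " + (int) c);
}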
This seems to be the root cause of a bug in the client-side portion of the game's UI: the encoded integer is sent back to the client, which performs the inverse operation (subtracting 32 from the char to get the int back).
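I haven't isolated the exact client code, but I assume it is equivalent to this:

char c = (char)(40 + 32); // server side: 40 is encoded as 'H' (72)
int value = c - 32;       // client side: the char widens to int, giving back 40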
It seems the client-side processing module takes the '?' literally, as the bar is repeatedly filled with the decoded value for '?' until the server starts sending back values smaller than 95.
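If a literal '?' (code 63) is what arrives, the inverse operation yields the same constant every time, which would explain the stuck bar:

char received = '?';         // what the client appears to get for any out-of-range value
int decoded = received - 32; // 63 - 32 = 31, regardless of what the server meant
System.out.println(decoded); // always prints 31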
- What character encoding might have been used on the authors' platform?
- What exactly is happening differently on my platform?
- What is, in your opinion, the easiest solution to this problem?