I'm working on translating a sample app from Java to C#; it involves cryptography (AES, RSA, and so on). At some point in the Java code (the version that actually works and is being translated to C#), I found this piece of code:
// data is a char[], encodedArr is a byte[]; i and j are ints set earlier
for (; i < size; i++) {
    encodedArr[j] = (byte) (data[i] & 0x00FF);
    j++;
}
After some googling (here), I've seen that this is a fairly common idiom, mainly in Java code.
I know that char is a 16-bit type and byte is only 8 bits, but I couldn't understand the reason for the bitwise AND in this char -> byte conversion.
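To see what the mask actually does, I put together a minimal test (the char value below is just an arbitrary example I picked, not from the real app). As far as I can tell, both forms produce the same byte, which only adds to my confusion:

public class MaskTest {
    public static void main(String[] args) {
        char c = '\u20AC'; // Euro sign: high byte 0x20, low byte 0xAC
        byte masked = (byte) (c & 0x00FF); // mask to the low byte, then narrow
        byte plain  = (byte) c;            // plain narrowing cast
        // Both print "ac": the narrowing cast already keeps only the low-order byte
        System.out.println(Integer.toHexString(masked & 0xFF));
        System.out.println(Integer.toHexString(plain & 0xFF));
        System.out.println(masked == plain); // prints true
    }
}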
Could someone explain?
Thank you in advance.