I am having trouble getting the decimal values of the bytes of a UTF-8 character and then converting them to binary (something like 12 = 0b1100). For example, how can I convert "ン" to its binary representation "11100011 10000011 10110011"?
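To make the goal concrete, this is the output I am after, shown here with bit shifts instead of division (a minimal sketch; it assumes the source file is saved as UTF-8, so the literal "ン" holds the three bytes 0xE3 0x83 0xB3):

#include <stdio.h>

int main(void)
{
    const unsigned char *s = (const unsigned char *)"ン"; /* 0xE3 0x83 0xB3 in UTF-8 */
    for (; *s != '\0'; s++) {
        for (int bit = 7; bit >= 0; bit--)      /* most significant bit first */
            putchar(((*s >> bit) & 1) ? '1' : '0');
        putchar(' ');
    }
    putchar('\n');                              /* 11100011 10000011 10110011 */
    return 0;
}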
I know that UTF-8 uses multiple bytes per character, so I want to print the input 8 bits at a time, from left to right. For ASCII, I use the approach below, but what can I use for UTF-8?
int c, i;
char *asc;   /* points at the static buffer returned by DecimalToBinary */

while ((c = getchar()) != EOF)
{
    asc = DecimalToBinary(c);
    for (i = 7; i >= 0; i--)
    {
        printf("%c", *(asc + i));   /* bits are stored LSB first, so print backwards */
    }
}
char *DecimalToBinary(int num)
{
    static char binary[8];
    int i;

    /* reset the buffer on every call; a static array would otherwise
       keep the leftover bits of the previous character */
    for (i = 0; i < 8; i++)
        binary[i] = '0';

    i = 0;
    while (num != 0) {
        if (num % 2 == 0)
            binary[i++] = '0';
        else
            binary[i++] = '1';
        num = num / 2;
    }
    return binary;
}
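For reference, a minimal harness that exercises DecimalToBinary on its own, by feeding it the three UTF-8 byte values of "ン" (0xE3 0x83 0xB3, i.e. the three bit groups above), would be a sketch like this:

#include <stdio.h>

char *DecimalToBinary(int num);   /* the function above */

int main(void)
{
    int bytes[] = { 0xE3, 0x83, 0xB3 };    /* UTF-8 bytes of "ン" */
    for (int k = 0; k < 3; k++) {
        char *bits = DecimalToBinary(bytes[k]);
        for (int i = 7; i >= 0; i--)       /* buffer is LSB first */
            putchar(bits[i]);
        putchar(' ');
    }
    putchar('\n');                         /* 11100011 10000011 10110011 */
    return 0;
}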