Differentiate between the character code representation of a decimal digit and its pure binary representation
I study computer science, and this is a concept I need to know for my exams, but I'm not sure I fully understand it.
Character code representation of 261 (for example)
Would this just be the ASCII code equivalent?
Meaning:
2 has ASCII code 50
6 has ASCII code 54
1 has ASCII code 49
So the character code representation would be 50, 54, 49 (one code per digit character)?
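To check this, I tried a small Python sketch (assuming Python's `ord()` returns the ASCII code of a character, which I believe it does for these digits):

```python
# Character code representation: each decimal digit of "261" is stored
# as a separate character, and ord() gives that character's code.
for ch in "261":
    print(ch, ord(ch), format(ord(ch), "08b"))

# Prints:
# 2 50 00110010
# 6 54 00110110
# 1 49 00110001
```

So that would be three separate codes, one per digit character?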
Pure Binary code representation
Is this just the binary conversion of 261?
So 100000101, since 261 = 256 + 4 + 1?
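Similarly, a sketch I used to check the pure binary version (assuming `bin()`, `format()`, and `int()` work the way I expect):

```python
# Pure binary representation: the whole value 261 as a single binary number.
value = 261
print(bin(value))             # 0b100000101
print(format(value, "09b"))   # 100000101
print(int("100000101", 2))    # 261, converting back to check
```

Is the distinction just that the character code version stores one code per decimal digit, while the pure binary version stores the value 261 itself as a single binary number?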