I was asked this question during an interview at a well-known IT company. They asked me how a character encoding could be implemented if we had so many characters that 16 bits of Unicode were not enough. I answered that we could use a 64-bit encoding for characters. They said even that was not enough, to which I suggested implementing an encoding backed by Java's BigInteger.
Then they said the encoding should only take the bits that are actually needed. For example, the ASCII representation of A is 01000001; we should not store the leading 0 because we don't need it, and keeping it wastes memory. I could not come up with an answer. How should I approach this problem, and how is it handled in practice?
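Thinking about it after the interview, the closest idea I can come up with is a variable-length scheme where each byte carries 7 data bits plus a continuation flag (a LEB128-style varint, similar in spirit to how UTF-8 uses continuation bits, though real UTF-8 differs in its details). This is only a sketch of what I guess they might have been hinting at, not something I'm sure is the expected answer:

import java.io.ByteArrayOutputStream;

public class VarWidthEncoding {
    // Encode a code point using 7 data bits per byte; the high bit of
    // each byte is a continuation flag (1 = more bytes follow).
    static byte[] encode(int codePoint) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        do {
            int sevenBits = codePoint & 0x7F;   // take the lowest 7 bits
            codePoint >>>= 7;                   // unsigned shift to drop them
            // set the continuation bit if more bits remain
            out.write(codePoint != 0 ? (sevenBits | 0x80) : sevenBits);
        } while (codePoint != 0);
        return out.toByteArray();
    }

    // Decode the first code point from the byte array.
    static int decode(byte[] bytes) {
        int value = 0, shift = 0;
        for (byte b : bytes) {
            value |= (b & 0x7F) << shift;       // accumulate 7 data bits
            if ((b & 0x80) == 0) break;         // continuation bit clear: done
            shift += 7;
        }
        return value;
    }

    public static void main(String[] args) {
        // 'A' (0x41) fits in one byte; U+1F600 needs three.
        System.out.println(encode('A').length);                           // 1
        System.out.println(encode(0x1F600).length);                       // 3
        System.out.println(Integer.toHexString(decode(encode(0x1F600)))); // 1f600
    }
}

With this, small code points like A cost a single byte and larger ones grow only as needed, so no fixed width is ever wasted. Is something along these lines (or the way UTF-8 actually does it) what they were getting at?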