I am reading the Unicode HOWTO in the Python documentation. It states that
a Unicode string is a sequence of code points, which are numbers from 0 through 0x10FFFF
which makes it look like the maximum number of bits needed to represent a code point is 24 (since 0x10FFFF has six hexadecimal digits, and 6*4=24).
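Just to double-check that arithmetic in the interpreter (this is only my own sanity check, not something from the HOWTO):

```python
# How many bits does the highest code point actually need?
max_code_point = 0x10FFFF
print(max_code_point)               # 1114111
print(max_code_point.bit_length())  # 21 -- so 24 bits (3 bytes) is already more than enough
```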
But then the documentation states:
The first encoding you might think of is using 32-bit integers as the code unit
Why is that? The first encoding I would think of uses 24-bit integers, not 32-bit ones.
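For example, every code point fits in a three-byte unit (again, just my own illustration of the 24-bit idea, not anything the HOWTO proposes):

```python
# Pack the highest code point into a 3-byte (24-bit) code unit -- it fits and round-trips.
code_point = 0x10FFFF
as_3_bytes = code_point.to_bytes(3, byteorder="big")
print(as_3_bytes.hex())                                            # '10ffff'
print(int.from_bytes(as_3_bytes, byteorder="big") == code_point)   # True
```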