I am wondering how to convert arbitrarily sized integers like 1, 12345, or 5324617851000199519157 into an array of integers.
[1] // for the first one
[...] // probably just a few values for the second one, 12345
[1, 123, 255, 32, ...] // not sure here for the large one...
I am not sure exactly what the resulting values would look like or how to compute them, but it would be something like this:
A bunch of 8-bit numbers that can somehow be used to reconstruct the original arbitrarily sized integer. I am not sure what calculations are required to do this either. What I do know is that each unique arbitrarily sized integer should produce a unique array of 8-bit values; that is, no two different integers should result in the same array.
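Maybe something like this rough sketch is what I am after (assuming JavaScript's BigInt, non-negative values, and most significant byte first):

    function toBytes(n) {
      // n is a BigInt, e.g. 12345n
      if (n === 0n) return [0];
      const bytes = [];
      while (n > 0n) {
        bytes.push(Number(n & 0xFFn)); // take the lowest 8 bits as a regular number
        n >>= 8n;                      // drop those 8 bits
      }
      return bytes.reverse();          // most significant byte first
    }

    function fromBytes(bytes) {
      // rebuild the original BigInt from the byte array
      let n = 0n;
      for (const b of bytes) n = (n << 8n) + BigInt(b);
      return n;
    }

    toBytes(1n)      // [1]
    toBytes(12345n)  // [48, 57]  (48 * 256 + 57 === 12345)
    fromBytes(toBytes(5324617851000199519157n)) === 5324617851000199519157n // true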
It doesn't matter much which language this is implemented in, but preferably an imperative language like JavaScript or C.
I am pretty sure the arrays should all be the same length as well, but if that's not possible, knowing how to do it a different way would be fine.
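For fixed-length arrays, I imagine padding with leading zero bytes up to some chosen size would work; the size of 16 bytes below is just an assumption, and the mapping stays unique for any value that fits:

    function toFixedBytes(n, size = 16) {
      const bytes = toBytes(n);        // toBytes from the sketch above
      if (bytes.length > size) {
        throw new RangeError("value does not fit in " + size + " bytes");
      }
      // left-pad with zero bytes so every array has exactly `size` entries
      return Array(size - bytes.length).fill(0).concat(bytes);
    }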