
Wondering how to convert arbitrarily sized integers like 1 or 12345 or 5324617851000199519157 to an array of 8-bit integers.

[1] // for the first one
// [probably just a few values for the second 12345...]
[1, 123, 255, 32, ...] // not sure here...

I am not sure what the resulting value would look like or how to compute it, but somehow it would be something like:

A bunch of 8-bit numbers that can be used to reconstruct (somehow) the original arbitrary integer. I am not sure what calculations would be required to do this either. All I do know is that each unique arbitrarily-sized integer should result in a unique array of 8-bit values; that is, no two different integers should result in the same array.

It doesn't matter much which language this is implemented in, but probably an imperative language like JavaScript or C.

I am pretty sure the arrays should all be the same length as well, but if that's not possible then knowing how to do it a different way would be okay.

– Lance
  • Why not just get the components like the year, month, day, etc.? – Barmar Nov 20 '18 at 22:35
  • Why not using timestamp with time() and localtime() to get the time as well as its components? – Tom Kuschel Nov 20 '18 at 22:37
  • Sorry the date made it confusing, the date part is irrelevant. – Lance Nov 20 '18 at 22:37
  • What are you actually trying to accomplish? You could easily split an n-bit integer into n/8 8-bit integers, and reconstitute the original... but if you were actually looking for a more efficient way of encapsulating time, that would be pointless. – Blunt Jackson Nov 20 '18 at 22:38
  • Nothing to do with date/datetime, sorry about that. Updated the question. I want to convert any integer of arbitrary size to array of 8-bit integers. – Lance Nov 20 '18 at 22:39

2 Answers


Most languages, including C and Javascript, have bit-shifting and bit-masking operations as part of their basic math operations. But beware in Javascript: numbers are 64-bit floats (exact for integers only up to 53 bits), and the bitwise operators only work on 32 bits. So:

let bignum = Date.now();                    // example value; any non-negative integer below 2^53 works
let hi = Math.floor(bignum / 0x100000000),  // high 32 bits (a shift can't reach these)
    lo = bignum & 0xFFFFFFFF,               // low 32 bits (as a signed 32-bit value)
    bytes = [
        (hi >> 24) & 0xFF,
        (hi >> 16) & 0xFF,
        (hi >> 8) & 0xFF,
        hi & 0xFF,
        (lo >> 24) & 0xFF,
        (lo >> 16) & 0xFF,
        (lo >> 8) & 0xFF,
        lo & 0xFF
    ];
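
Going back the other way (the question also asks about reconstructing the original), here is a minimal sketch that continues from the snippet above; `bytesToNumber` is a hypothetical helper, not part of the answer's code:

// Hypothetical helper: rebuild the number from the eight big-endian bytes.
// Multiplication is used instead of shifts for the same 32-bit reason.
function bytesToNumber(bytes) {
    let n = 0;
    for (const b of bytes) {
        n = n * 256 + b;   // stays exact as long as the result is below 2^53
    }
    return n;
}

console.log(bytesToNumber(bytes) === bignum); // true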
– Lee Daniel Crocker
  • If the integer is a string and it's a "bigint" of arbitrary size, I don't think this would work. I would like for it to handle that. I'm also not sure how you knew to have 8 items in the `bytes` array; that probably wouldn't work for arbitrarily sized integers ranging from `0` to `'47171857151875817758571875815815782572758275672576575677'` or whatever arbitrary size. – Lance Nov 20 '18 at 22:47
  • This may seem relatively straightforward to you, but I don't understand where `Math.floor(bignum / 0x100000000)` comes from? – zfrisch Nov 20 '18 at 23:09
  • @zfrisch That's shifting the high 32 bits down into the low 32 bits. Javascript's bitwise operators truncate their operand to 32 bits first, so `>>` can never reach the high half of a larger number (illustrated in the snippet below). – Lee Daniel Crocker Nov 20 '18 at 23:48
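
A minimal sketch of why the division is used instead of a shift (an illustration only, not part of the answer):

let big = 0x100000001 * 0x100;          // 2^40 + 256, needs more than 32 bits
console.log(big >> 8);                  // 1 -- the operand was truncated to its low 32 bits first
console.log(Math.floor(big / 0x100));   // 4294967297 -- division sees the whole value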

I'm not sure if this is too brute-forcey for what you want, but you can take an arbitrary string and just do the long division into a Uint8Array.

Here's a function (borrowed liberally from here) that will convert back and forth from an arbitrarily long string:

function eightBit(str){ 
    // Schoolbook base conversion: for each decimal digit, multiply the
    // accumulated base-256 digits by 10, add the digit, and carry as needed.
    let dec = [...str],  sum = []
    while(dec.length){
        let s = 1 * dec.shift()
        for(let i = 0; s || i < sum.length; i++){
            s += (sum[i] || 0) * 10
            sum[i] = s % 256
            s = (s - sum[i]) / 256
        }
    } 
    return Uint8Array.from(sum.reverse()) // sum is little-endian, so reverse
}


function eightBit2String(arr){ 
    // The same long division in the other direction: base 256 back to base 10.
    var dec = [...arr], sum = []
    while(dec.length){
        let s = 1 * dec.shift()
        for(let i = 0; s || i < sum.length; i++){
            s += (sum[i] || 0) * 256
            sum[i] = s % 10
            s = (s - sum[i]) / 10
        }
    }
    return sum.reverse().join('')
}

// sanity check
console.log("256 = ", eightBit('256'), "258 = ", eightBit('258')) 
 
let n = '47171857151875817758571875815815782572758275672576575677'
let a = eightBit(n)
console.log("to convert:", n)
console.log("converted:", a.toString())
let s = eightBit2String(a)
console.log("converted back:", s)

No doubt, there are some efficiencies to be found (maybe you can avoid the interim arrays).
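
One way those interim arrays could be avoided, assuming the runtime has BigInt support (a sketch only; `eightBitViaBigInt` is a made-up name, not from the answer):

function eightBitViaBigInt(str) {
    let n = BigInt(str), bytes = []
    while (n > 0n) {
        bytes.push(Number(n & 0xFFn))   // take the low byte
        n >>= 8n                        // and drop it
    }
    if (!bytes.length) bytes.push(0)    // so '0' still yields one byte
    return Uint8Array.from(bytes.reverse()) // big-endian, same order as eightBit()
}

console.log(eightBitViaBigInt('47171857151875817758571875815815782572758275672576575677').toString())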

– Mark
  • Wondering if the output will be unique for every unique input. That is, there will be no collisions. – Lance Feb 13 '19 at 20:35
  • this fails if there are zeroes in the string, any idea how to fix? Like on 1024. – Lance Aug 19 '21 at 23:22
  • @LancePollard when I run `eightBit('1024')` it produces `[4, 0]` as expected. Converting back, `eightBit2String([4, 0])` gives `1024`, also as expected. Is my expectation wrong (this is a pretty old question--haven't thought about this in a while)? – Mark Aug 20 '21 at 00:08
  • Oh ok I think you are right, I was processing it backwards, alright good to know, thanks! – Lance Aug 20 '21 at 00:21
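
A quick check of that exchange, using the functions from the answer above:

console.log(eightBit('1024'))                  // Uint8Array [4, 0]
console.log(eightBit2String(eightBit('1024'))) // '1024' -- the zero byte survives the round trip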