I have a simple C program that uses unsigned long long:
#include <stdio.h>
#include <stdlib.h>

/* Multiply-add step on the numeric IMEI id, in 64-bit unsigned arithmetic */
unsigned long long get_random_id(const char *imeiId)
{
    const unsigned long long MULT = 2862933555777941757ULL;
    const unsigned long long ADDEND = 3037000493ULL;
    unsigned long long newId, oldId;

    oldId = atoll(imeiId);
    newId = MULT * oldId + ADDEND;
    return newId;
}

int main(void)
{
    printf("%llu\n", get_random_id("351746051295833"));
    return 0;
}
I'm supposed to convert this to Java, so I'm using BigInteger as follows:
import java.math.BigInteger;

public class RandomId {
    public static void main(String[] args) {
        System.out.println(get_random_id("351746051295833"));
    }

    static BigInteger get_random_id(String imeiId) {
        final String MULT_STRING = "2862933555777941757";
        final String ADDEND_STRING = "3037000493";
        BigInteger MULT = new BigInteger(MULT_STRING);
        BigInteger ADDEND = new BigInteger(ADDEND_STRING);
        BigInteger oldId = new BigInteger(imeiId);
        BigInteger temp = MULT.multiply(oldId);
        BigInteger newId = temp.add(ADDEND);
        return newId;
    }
}
My problem is that I'm not getting the same output from the Java and C code. The C code prints 10076018645131828514, while the Java code prints 1007025573367229468539210487799074.
I can't understand why the outputs differ for the same input.
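My only guess so far (I'm not sure it's correct) is that the C version silently wraps around at 2^64 while BigInteger keeps every digit. Reducing the BigInteger result modulo 2^64 does seem to reproduce the C output, but I'd like to confirm that is really what's happening:

```java
import java.math.BigInteger;

public class WrapCheck {
    public static void main(String[] args) {
        BigInteger mult = new BigInteger("2862933555777941757");
        BigInteger addend = new BigInteger("3037000493");
        BigInteger oldId = new BigInteger("351746051295833");

        // BigInteger keeps full precision: this matches my Java output
        BigInteger full = mult.multiply(oldId).add(addend);
        System.out.println(full);    // 1007025573367229468539210487799074

        // Reducing modulo 2^64 mimics unsigned long long overflow:
        // this matches my C output
        BigInteger wrapped = full.mod(BigInteger.ONE.shiftLeft(64));
        System.out.println(wrapped); // 10076018645131828514
    }
}
```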
PS: I'm running the code on a 32-bit Ubuntu machine, compiling with gcc.