I need to implement 1024-bit math operations in C. I implemented a simple BigInteger library where the integer is stored as an array, "typedef INT UINT1024[400]", with each element representing one decimal digit. It turned out to be so slow that I decided to implement the BigInteger as a 1024-bit array of UINT64 instead: "typedef UINT64 UINT1024[16]".
So, for example, 1000 is represented as {0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1000}, 18446744073709551615 as {0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0xFFFFFFFFFFFFFFFF}, and 18446744073709551616 as {0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0}.
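For concreteness, here is that layout written out as compilable C; this is a minimal sketch that substitutes the standard uint64_t from <stdint.h> for the platform UINT64 typedef:

#include <stdint.h>

/* 1024 bits = 16 limbs of 64 bits, stored most significant first:
   index 0 is the most significant limb, index 15 the least significant. */
typedef uint64_t UINT1024[16];

UINT1024 a = {0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1000};                   /* 1000 */
UINT1024 b = {0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0xFFFFFFFFFFFFFFFFULL}; /* 2^64 - 1 */
UINT1024 c = {0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0};                      /* 2^64 */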
I started by writing the functions to convert a char-array number to a UINT1024 and a UINT1024 back to a char array; they worked for numbers <= 0xFFFFFFFFFFFFFFFF. Here's what I did:
void UINT1024_FROMSTRING(UINT1024 Integer, const char szInteger[], UINT Length)
{
    int c = 15;
    UINT64 Result = 0, Operation, Carry = 0;
    UINT64 Temp = 1;

    while (Length--)
    {
        Operation = (szInteger[Length] - '0') * Temp;
        Result += Operation + Carry;
        /* Overflow? */
        if (Result < Operation || Temp == 1000000000000000000)
        {
            Carry = Result - Operation;
            Result = 0;
            Integer[c--] = 0;
            Temp = 1;
        }
        else Carry = 0;
        Temp *= 10;
    }
    if (Result || Carry)
    {
        /* I DON'T KNOW WHAT TO DO HERE! */
    }
    while (c--) Integer[c] = 0;
}
So please, how can I implement this? And is it possible to implement it using UINT64 for speed, or should I stick with one digit per array element, which is very slow for 1024-bit operations?
PS: I can't use any existing library!
Thanks in advance!
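For reference, the usual way to do this conversion is to process the string most significant digit first, multiplying the accumulated value by 10 and adding each digit. Here is a minimal sketch, assuming the big-endian limb layout above and the _uint128_mul routine from the update below; the helper names uint1024_mul_small and uint1024_add_small are hypothetical:

#include <stdint.h>

/* UINT1024 and _uint128_mul as defined elsewhere in this post */
void _uint128_mul(uint64_t u, uint64_t v, uint64_t *hi, uint64_t *lo);

/* Integer = Integer * Small; overflow past the top limb is discarded. */
static void uint1024_mul_small(UINT1024 Integer, uint64_t Small)
{
    uint64_t carry = 0;
    for (int i = 15; i >= 0; i--) {
        uint64_t hi, lo;
        _uint128_mul(Integer[i], Small, &hi, &lo);
        lo += carry;
        if (lo < carry)   /* the addition wrapped: carry into the high word */
            hi++;
        Integer[i] = lo;
        carry = hi;
    }
}

/* Integer = Integer + Small. */
static void uint1024_add_small(UINT1024 Integer, uint64_t Small)
{
    uint64_t carry = Small;
    for (int i = 15; i >= 0 && carry; i--) {
        Integer[i] += carry;
        carry = (Integer[i] < carry); /* 1 if the addition wrapped */
    }
}

void UINT1024_FROMSTRING(UINT1024 Integer, const char szInteger[], unsigned Length)
{
    for (int i = 0; i < 16; i++)
        Integer[i] = 0;
    while (Length--) {
        uint1024_mul_small(Integer, 10);
        uint1024_add_small(Integer, (uint64_t)(*szInteger++ - '0'));
    }
}

This removes the power-of-10 bookkeeping entirely: all the wraparound handling lives in the two helpers, so there is no leftover Result/Carry case at the end of the loop.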
Update: I still can't figure out how to do the multiplication. I am using this function:
/* 64 x 64 -> 128-bit multiply built from four 32-bit partial products */
void _uint128_mul(UINT64 u, UINT64 v, UINT64 *ui64Hi, UINT64 *ui64Lo)
{
    UINT64 ulo, uhi, vlo, vhi, k, t;
    UINT64 wlo, whi, wt;

    uhi = u >> 32;
    ulo = u & 0xFFFFFFFF;
    vhi = v >> 32;
    vlo = v & 0xFFFFFFFF;

    t = ulo * vlo;
    wlo = t & 0xFFFFFFFF;
    k = t >> 32;

    t = uhi * vlo + k;
    whi = t & 0xFFFFFFFF;
    wt = t >> 32;

    t = ulo * vhi + whi;
    k = t >> 32;

    *ui64Lo = (t << 32) + wlo;
    *ui64Hi = uhi * vhi + wt + k;
}
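That routine can be sanity-checked in isolation; the test values below are my own, hypothetical ones:

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint64_t hi, lo;

    /* (2^64 - 1)^2 = 2^128 - 2^65 + 1 -> hi = 0xFFFFFFFFFFFFFFFE, lo = 1 */
    _uint128_mul(0xFFFFFFFFFFFFFFFFULL, 0xFFFFFFFFFFFFFFFFULL, &hi, &lo);
    assert(hi == 0xFFFFFFFFFFFFFFFEULL && lo == 1);

    /* 10 * 2^63 = 5 * 2^64 -> hi = 5, lo = 0 */
    _uint128_mul(10, 0x8000000000000000ULL, &hi, &lo);
    assert(hi == 5 && lo == 0);

    puts("ok");
    return 0;
}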
Then I call it from this function:
void multiply(UINT1024 dUInteger, UINT1024 UInteger)
{
    int i = 16;
    UINT64 lo, hi, Carry = 0;

    while (i--)
    {
        _uint128_mul(dUInteger[i], UInteger[15], &hi, &lo);
        dUInteger[i] = lo + Carry;
        Carry = hi;
    }
}
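For reference, the routine above only multiplies by the lowest limb, UInteger[15], and the carry out of lo + Carry is lost when that addition wraps. A full 1024 x 1024 -> 1024-bit product needs a partial product for every pair of limbs, each accumulated at the right position with its carry rippled upward. A minimal schoolbook sketch, assuming the big-endian limb layout and reusing _uint128_mul; the name uint1024_mul and the non-aliasing three-operand interface are my own:

#include <stdint.h>

/* Result = A * B, truncated to 1024 bits (the high half of the 2048-bit
   product is discarded). Result must not alias A or B. */
void uint1024_mul(UINT1024 Result, const UINT1024 A, const UINT1024 B)
{
    for (int i = 0; i < 16; i++)
        Result[i] = 0;

    for (int i = 15; i >= 0; i--) {       /* limbs of A, least significant first */
        for (int j = 15; j >= 0; j--) {   /* limbs of B */
            int k = i + j - 15;           /* destination limb of this partial product */
            if (k < 0)
                continue;                 /* lands entirely above bit 1023: truncated */

            uint64_t hi, lo;
            _uint128_mul(A[i], B[j], &hi, &lo);

            /* add the low word at limb k; fold its carry into hi
               (hi <= 2^64 - 2, so hi + 1 cannot overflow) */
            Result[k] += lo;
            hi += (Result[k] < lo);

            /* add the high word at limb k-1 and ripple any carry upward */
            uint64_t carry = hi;
            for (int m = k - 1; m >= 0 && carry; m--) {
                Result[m] += carry;
                carry = (Result[m] < carry);
            }
        }
    }
}

With this layout the product of limbs i and j belongs at limb i + j - 15; pairs with i + j < 15 lie entirely above bit 1023 of the truncated result, which is why they can be skipped.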
I really need some help with this. Thanks in advance!