I'm trying to write a class in C# that converts a big number (in string format) to any number system. I found this code at How to convert a gi-normous integer (in string format) to hex format? (C#):
var s = "843370923007003347112437570992242323";
var result = new List<byte>();
result.Add( 0 );
foreach ( char c in s )
{
    int val = (int)( c - '0' );
    for ( int i = 0 ; i < result.Count ; i++ )
    {
        int digit = result[i] * 10 + val;
        result[i] = (byte)( digit & 0x0F );
        val = digit >> 4;
    }
    if ( val != 0 )
        result.Add( (byte)val );
}
var hex = "";
foreach ( byte b in result )
    hex = "0123456789ABCDEF"[ b ] + hex;
With a few modifications, this code also works for any numeric base that is a power of two (2^n).
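To show what I mean, here is my own sketch of that generalisation, assuming the target base is a power of two; the method name ConvertDecimalString and the parameter bitsPerDigit are my own naming, not from the original answer:

using System.Collections.Generic;

// Sketch only: the same algorithm, parameterised by the number of bits per output digit.
// Assumes the target base is 2^bitsPerDigit and that alphabet has at least that many characters.
static string ConvertDecimalString( string s, int bitsPerDigit, string alphabet )
{
    int mask = ( 1 << bitsPerDigit ) - 1;   // 0x0F for hex, 0x3F for base 64
    var result = new List<byte> { 0 };      // output digits, least significant first

    foreach ( char c in s )
    {
        int val = (int)( c - '0' );
        for ( int i = 0 ; i < result.Count ; i++ )
        {
            int digit = result[i] * 10 + val;
            result[i] = (byte)( digit & mask );
            val = digit >> bitsPerDigit;
        }
        if ( val != 0 )
            result.Add( (byte)val );
    }

    var output = "";
    foreach ( byte b in result )
        output = alphabet[ b ] + output;    // most significant digit first
    return output;
}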
The problem is that I do not understand the logic of the algorithm (the for statement). Can someone please explain this part of the code:
for ( int i = 0 ; i < result.Count ; i++ )
{
    int digit = result[i] * 10 + val;
    result[i] = (byte)( digit & 0x0F );
    val = digit >> 4;
}
if ( val != 0 )
    result.Add( (byte)val );
In order to make this code convert, for example, a decimal string to a base-64 string, I need to change the mask so it keeps six bits instead of the four used for the hex system, and then right-shift digit by 6 so that the remainder carries into the next byte:
result[i] = (byte)( digit & 0x03F );
val = digit >> 6; // 2^6 = 64
and finally change the look-up table used to print the result:
hex = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"[ b ] + hex;
It is this line in the for loop that I don't fully understand:
int digit = result[i] * 10 + val;
What are this line and the loop doing to every byte of result on each iteration? And, most importantly, why?
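To make the question concrete, here is the trace I get when stepping through the loop by hand for the small input "27" (expected hex output 1B). The values come out right; I just don't see why multiplying the stored digit by 10 and adding the carry works:

start:    result = [ 0 ]
c = '2':  val = 2;  digit = 0*10 + 2 = 2;   result = [ 2 ];   val = 2 >> 4 = 0
c = '7':  val = 7;  digit = 2*10 + 7 = 27;  result = [ 11 ];  val = 27 >> 4 = 1;  result becomes [ 11, 1 ]
output:   "0123456789ABCDEF"[1] + "0123456789ABCDEF"[11] = "1B"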