I am working on an assignment, and it is asking me to calculate a checksum by taking the least significant byte of the one's complement of an integer.
This is the part of the assignment outline I am confused by:
"The CHECKSUM field (MM) value is calculated by taking the least significant byte of the 1’s Complement value of the sum of the COUNT, ADDRESS and DATA fields of the record"
I'm a little unclear on what this means, as I haven't really worked with one's complements or LSBs in C.
What I have so far is:
int checkSum(int count, int address, char* data)
{
    int i = 0;
    int dataTotal = 0;

    /* add up the data bytes; each byte is two hex characters */
    for (i = 0; i < strlen(data); i += 2)
    {
        dataTotal += (getIntFromHex(data[i]) * 16) + getIntFromHex(data[i + 1]);
    }

    /* not sure this actually takes the least significant byte */
    int checksum = ~(count + address + dataTotal) & 1;
    printf("Checksum: %.2X\n", checksum);
    return checksum;
}
I didn't really expect this to work, but I've done some research and this is what I came up with.
I need some clarification on what is meant by the least significant byte.
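From the reading I've done, my guess is that "least significant byte" means the low 8 bits of the value, which would mean masking with 0xFF rather than 1, but I'm not sure that's what the assignment intends. Here is a minimal, self-contained sketch of that guess (the count, address and data values are just made-up test numbers, not from the assignment):

#include <stdio.h>

int main(void)
{
    /* made-up example values for COUNT, ADDRESS and the summed DATA bytes */
    int count = 0x13;
    int address = 0x2A40;
    int dataTotal = 0x7C + 0x0A + 0x01;

    int sum = count + address + dataTotal;

    /* keep only the low 8 bits of the 1's complement of the sum */
    int checksum = ~sum & 0xFF;

    printf("Checksum: %.2X\n", checksum);
    return 0;
}

Is that roughly the right idea, or does "least significant byte" mean something else here?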
P.S. The for loop is there simply to get the total of the data. It's not important for this, but the code uses the variable, so I figured I would copy the whole thing to avoid confusion.