I want to read a file 32 bytes at a time in a C/C++ program, and I want to be sure that each chunk of data is really 256 bits. Essentially, I'm worried about the leading bits of the "bytes" I read from the file being lost. Is that even a concern?

For example, take the number 2, which is 10 in binary. That would be enough for me as a human. How is that different, as far as the computer is concerned, from writing it as 00000010 to represent a char value of 1 byte? Do the leading zeros affect the bit count? Does that in turn affect operations like XOR? Could it cause data loss? I'm having trouble understanding the effects.
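To make it concrete, here's a minimal sketch of what I'm trying to do (the file name "input.bin" is just a placeholder). It reads the file in 32-byte blocks and then XORs the bytes, which is where I worry the "leading zeros" might matter:

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned char block[32];               /* 32 bytes == 256 bits */
    FILE *fp = fopen("input.bin", "rb");   /* placeholder file name */
    if (!fp) {
        perror("fopen");
        return 1;
    }

    size_t n;
    while ((n = fread(block, 1, sizeof block, fp)) > 0) {
        /* If the file size isn't a multiple of 32, the last read is short;
           pad with zeros so the block is always a full 256 bits. */
        if (n < sizeof block)
            memset(block + n, 0, sizeof block - n);

        /* Example operation: XOR each byte with a fixed mask. My question
           is whether any leading zero bits in a byte behave differently
           here than the other bits. */
        for (size_t i = 0; i < sizeof block; i++)
            block[i] ^= 0xAB;
    }

    fclose(fp);
    return 0;
}
```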
Any help clearing up my misunderstanding would be appreciated!