I am quite new to C programming and have lately been fiddling around with some low-level I/O functions in C on my x86 Linux system to get a better understanding of the internals. As an exercise, I decided to write a small program that reads data from a file and produces hexdump-like output on the console. It reads the first 512 bytes of the file and passes the buffer to a function that prints them.
#include <stdio.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <stdlib.h>
#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#define BYTES 512
void printout(unsigned char *data) {
    for (int i = 0; i < BYTES; i++) {
        if (i % 16 == 0)
            printf("%07x ", i);          /* offset at start of each row */
        printf("%02x", data[i]);
        if (i % 2)
            printf(" ");                 /* space after every second byte */
        if ((i + 1) % 16 == 0)
            printf("\n");                /* 16 bytes per row */
    }
    printf("%07x\n", BYTES);             /* final offset */
}

int main(int argc, char **argv) {
    unsigned char data[BYTES];
    char *f;
    int fd;

    if (argc < 2) {
        fprintf(stderr, "Usage: %s <filename>\n", argv[0]);
        return EXIT_FAILURE;
    }
    f = argv[1];
    fd = open(f, O_RDONLY);
    if (fd == -1) {
        fprintf(stderr, "Error: Could not open file %s\n", f);
        return EXIT_FAILURE;
    }
    if (read(fd, data, BYTES) != BYTES) {
        fprintf(stderr, "Error: Could not read %d bytes from %s\n", BYTES, f);
        close(fd);
        return EXIT_FAILURE;
    }
    printout(data);
    close(fd);
    return EXIT_SUCCESS;
}
Unfortunately, when I compare my output to that of hexdump or od, it looks as though my program swaps the byte order.
$ myprogram /dev/sda |head -1
0000000 eb63 9000 0000 0000 0000 0000 0000 0000
$ hexdump /dev/sda |head -1
0000000 63eb 0090 0000 0000 0000 0000 0000 0000
Since od -x --endian=little produces the same result as hexdump, I am certain the problem lies in my code, but I have no idea where. I would be grateful if anyone could explain why this happens and what I am missing. I have searched the web but have not found anything helpful so far.
Thank you for your help!
Regards, Dirk