
For a personal project I am working on, I seem to be getting an unexpected result when I subtract a long unsigned value from an int.

Details:

I #define a value that will be used across multiple files and set it to 256. This is the maximum size of the structs I make.

I then #define another macro that is the combined size of all members of a struct except a trailing list. This comes to 51 (6 + 9 + 3 * sizeof(int) + 3 * sizeof(time_t), with a 4-byte int and an 8-byte time_t).

In a struct I create, I use those #defined values to size a list that fills the rest of the space in the "block," so the list should be 256 - 51 = 205 elements.

When I print the size of the struct as a whole, I get 264. When I print the difference between the two #defined values, I get the expected answer of 205 (256 - 51).

The struct should be 256 bytes, and I do not know why it is not that size.

Any help will be appreciated!

Reproducible Code

test.c:

#include "test.h"

int main(){
   //Prints 205, which seems correct
   printf("OUTPUT: %lu\n", (BLOCKSIZE-META));

   //Prints 264, which is wrong 
   printf("OUTPUT: %d\n", (int) sizeof(Inode));
   return 0;
}

test.h:

#include <stdio.h>
#include <time.h>

#define BLOCKSIZE 256
#define META (6 + 9 + (3 * (sizeof (int))) + (3 * (sizeof(time_t))))


typedef struct Inode {
   char a;
   char b;
   char c[9];
   int d;
   char e;
   char f;
   int g;
   int h;
   time_t i;
   time_t j;
   time_t k;
   char l;
   char m[BLOCKSIZE - META];
} Inode;

Output:

OUTPUT: 205
OUTPUT: 264
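
To see where the extra bytes come from, here is a quick diagnostic I could add — a minimal sketch that assumes the same test.h as above (the file name check.c is just for illustration). It prints where each multi-byte member starts inside the struct; any gap between the end of one member and the start of the next, or after the last member, would be alignment padding added by the compiler.

check.c:

#include <stddef.h>   /* offsetof */
#include "test.h"

int main(void){
   //Byte offset of each multi-byte member inside Inode
   printf("d starts at %zu\n", offsetof(Inode, d));
   printf("g starts at %zu\n", offsetof(Inode, g));
   printf("h starts at %zu\n", offsetof(Inode, h));
   printf("i starts at %zu\n", offsetof(Inode, i));
   printf("m starts at %zu\n", offsetof(Inode, m));
   printf("sizeof(Inode) = %zu\n", sizeof(Inode));
   return 0;
}

Comparing those offsets against the members' sizes should show whether the missing bytes are between members or after m.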