I have written a simple C program like this:

#include <stdio.h>

int main()
{
    long iteration = 0;
    
    while(1)
    {
        iteration += 10;
        
        if(iteration % 100000000 == 0)
        {
            printf("Iteration %ld\n", iteration / 1000000);
        }
    }

    return 0;
}

When the iteration reaches 2100, the program stops printing anything. It seems that iteration is treated as an int by the compiler (GNU GCC), but I defined it as long. What is the problem?

2100 * 1000000 = 2100000000 (2.1 billion), which is exactly where an int would overflow, but my variable is defined as long.
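For reference, here is a minimal sketch (using only standard headers) that prints the size and range of long on the platform in question. If it reports 4 bytes, then long has the same range as int there (for example, Windows builds of GCC such as MinGW use a 32-bit long even on 64-bit systems), and the counter would wrap around at about 2.1 billion:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Print how wide long is on this platform and its maximum value.
       A 4-byte long means LONG_MAX is 2147483647, i.e. the same
       limit as int. */
    printf("sizeof(long) = %lu bytes\n", (unsigned long)sizeof(long));
    printf("LONG_MAX     = %ld\n", LONG_MAX);

    return 0;
}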
