4
#include <stdio.h>
#include <math.h>

// #define LIMIT 600851475143
int isP(long i);
void run();

// expected answer: 6857
int main()
{
    //int i = 6857;
    //printf("%d\n", isP(i));
    run();
}

void run()
{
    long LIMIT = 600851475143;
    // 3, 5
    // under 1000

    long i, largest = 1, temp = 0;

    // try every odd candidate factor up to ~sqrt(600851475143)
    for (i = 3; i <= 775147; i += 2)
    {
        temp = (LIMIT / i) * i;   // equals LIMIT only when i divides LIMIT exactly
        if (LIMIT == temp)
            if (isP(i) == 1)
                largest = i;
    }
    printf("%d\n", largest);
}

// trial-division primality test for odd i
int isP(long i)
{
    long j;
    for (j = 3; j <= i / 2; j += 2)
        if (i == (i / j) * j)
            return 0;

    return 1;
}

I just ran into an interesting issue. The code above is meant to find the largest prime factor of LIMIT (600851475143). As written, it prints 29, which is incorrect.

Strangely, when I #define the LIMIT value instead of declaring it as a long variable, it prints the correct value: 6857.
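For reference, the only line I change between the two runs is this one (a minimal sketch of the difference; everything else stays exactly as shown above):

long LIMIT = 600851475143;      /* version that prints 29 on my machine */

versus

#define LIMIT 600851475143      /* version that prints 6857 */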

Could someone help me to figure out the reason? Thanks a lot!

user2085606
    Works fine here? Clang 4.0. Might be because my `long` is 64 bit. – James M Mar 06 '13 at 17:09
  • @JamesMcLaughlin Yeah - on many platforms, `long` is a 32bit signed integer, and will fail, but that's platform/compiler specific. – Reed Copsey Mar 06 '13 at 17:12
  • You need to put #define LIMIT 600851475143L; see this answer: http://stackoverflow.com/questions/1458923/long-long-in-c-c – Dampsquid Mar 06 '13 at 17:13
  • C will automatically promote an integer constant to the size required when used in an expression (unless you're explicit about the type). You still can't assign that to `long`, if it won't fit. – teppic Mar 06 '13 at 17:14
  • Thx for all of you above! – user2085606 Mar 08 '13 at 14:57

6 Answers

3

A long on many platforms is a 4-byte integer, and it overflows above 2,147,483,647. For example, see Visual C++'s Data Type Ranges page.

When you use a #define, the constant is substituted as text, and the compiler gives the literal a type large enough to hold your very large number (typically long long). This can cause it to behave correctly and give you the answer you expect.

In general, however, I would recommend being explicit about the data type, and choosing a data type that will represent the number correctly without requiring compiler- and platform-specific behavior, if possible.
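For example, a minimal sketch of being explicit about the width (the fixed-width type and the INT64_C macro are just one way to do it, not something the original code uses):

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    /* int64_t is always 64 bits, so the constant fits no matter how wide
       the platform's long happens to be */
    int64_t limit = INT64_C(600851475143);
    printf("%" PRId64 "\n", limit);
    return 0;
}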

Reed Copsey
  • Thanks a lot, all of you. I'm used to Java, where long is always 64-bit... I neglected such details and ran into many dead ends these days. I should have been more careful. Again, thanks. – user2085606 Mar 08 '13 at 14:56
1

I would suspect a numeric type issue here.

#define is a preprocessor directive, so it would replace LIMIT with that number in the code before running the compiler. This leaves the door open for the compiler to interpret that number how it wants, which may not be as a long.

In your case, long probably isn't big enough, so the compiler chooses something else when you use #define. For consistent behavior, you should specify a type that you know has an appropriate range and not rely on the compiler to guess correctly.

You should also turn on full warnings on your compiler; it may be able to detect this sort of problem for you.
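For example, with gcc something like this (the file name is just a placeholder):

gcc -Wall -Wextra euler.c -o euler

will normally produce an overflow warning for the long LIMIT = 600851475143; line; the exact wording of the diagnostic is quoted in another answer below.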

WildCrustacean
1

When you enter an expression like:

(600851475143 + 1)

everything is fine, as the compiler automatically promotes both of those constants to an appropriate type (like long long in your case) large enough to perform the calculation. You can do as many expressions as you want in this way. But when you write:

long n = 600851475143;

the compiler tries to assign a long long (or whatever the constant is implicitly converted to) to a long, which results in a problem in your case. Your compiler should warn you about this, gcc for example says:

warning: overflow in implicit constant conversion [-Woverflow]

Of course, if a long is big enough to hold that value, there's no problem, since the constant will have a type either the same size as long or smaller.
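Here's a tiny self-contained sketch of the difference (the variable names are just for illustration):

#include <stdio.h>

int main(void)
{
    long long ok  = 600851475143 + 1;  /* fine: the constant already has a wide enough type */
    long      bad = 600851475143;      /* this is the line gcc warns about on 32-bit-long platforms */
    printf("%lld %ld\n", ok, bad);
    return 0;
}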

teppic
0

It is probably because 600851475143 is larger than LONG_MAX (2,147,483,647 on platforms where long is 32 bits).
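You can check the limit on your own machine (a minimal sketch):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* prints 2147483647 on platforms where long is 32 bits */
    printf("LONG_MAX = %ld\n", LONG_MAX);
    return 0;
}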

Mike D
0

Try replacing the type of LIMIT with long long. The way it stands, it wraps around (try printing LIMIT as a long and look at the value). With the #define, the constant is substituted as-is and the compiler gives it a type wide enough to hold it, which is why that version works.
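For example (the exact value you see is implementation-specific, so treat this as a sketch):

#include <stdio.h>

int main(void)
{
    long limit = 600851475143;   /* too big for a 32-bit long */
    printf("%ld\n", limit);      /* on such a platform this does not print 600851475143 */
    return 0;
}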

Rad'Val
0

Your code basically reduces to these two possibilities:

long LIMIT = 600851475143;
x = LIMIT / i;

vs.

#define LIMIT 600851475143
x = LIMIT / i;

The first one is equivalent to converting the constant to a long:

x = (long)600851475143 / i;

while the second one will be precompiled into:

x = 600851475143 / i;

And here lies the difference: 600851475143 is too big for your compiler's long type, so when it is converted to long the value cannot be represented and the result is meaningless. But if it is used directly in the division, the compiler knows that it doesn't fit into a long, automatically treats it as a long long literal, promotes i, and does the division as a long long.

Note, however, that even if the algorithm seems to work most of the time, you still have overflows elsewhere, so the code is incorrect. You should declare any variable that may hold these big values as long long, as in the sketch below.
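A corrected sketch of the core loop, with every variable that can hold big values declared as long long (this just mirrors the question's code with the types and format specifiers adjusted):

#include <stdio.h>

int isP(long long i);                    /* same trial-division test as in the question */

void run(void)
{
    long long LIMIT = 600851475143LL;
    long long i, largest = 1, temp;

    for (i = 3; i <= 775147; i += 2)     /* odd candidates up to ~sqrt(LIMIT) */
    {
        temp = (LIMIT / i) * i;          /* no overflow now: everything is long long */
        if (LIMIT == temp && isP(i) == 1)
            largest = i;
    }
    printf("%lld\n", largest);           /* prints 6857 */
}

int isP(long long i)
{
    long long j;
    for (j = 3; j <= i / 2; j += 2)
        if (i == (i / j) * j)
            return 0;
    return 1;
}

int main(void)
{
    run();
    return 0;
}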

rodrigo