I am currently evaluating the expression (x + y)^2 / (x * y) for a large range of inputs.
I have noticed an issue when x = 46340 and y = 1.
The code outputs -46341, as shown below.
__global__ void proof() {
    int x = 1;
    int y = 1;
    int multi_number = 100000;  // upper limit for both x and y
    bool logged = false;
    while (true) {
        // evaluate (x + y)^2 / (x * y) for the current pair
        long eq = ((x + y) * (x + y)) / (x * y);
        if (x >= multi_number) {
            // x has reached the limit: wrap it around and move on to the next y
            x = 1;
            y = y + 1;
        }
        if (eq < 4) {
            // the expression should always be >= 4; report the first pair that breaks this
            if (logged == true) {
                continue;
            }
            printf("\nGPU: Equation being used: (%d+%d)^2 / %d*%d >= 4", x, y, x, y);
            printf("\nGPU: Proof Failed: %d, %d", x, y);
            logged = true;
            continue;
        }
        if (y >= multi_number) {
            if (x >= multi_number) {
                // every pair up to the limit has been checked
                if (logged == true) {
                    continue;
                }
                printf("\nGPU: Proof is true for all cases.");
                logged = true;
                continue;
            }
        }
        printf("\nGPU: Equation being used: (%d+%d)^2 / %d*%d >= 4", x, y, x, y);
        printf("\nGPU: %ld", eq);  // print the result of the expression
        x = x + 1;
    }
}
I have tried rewriting the expression and also entering it into a calculator. The calculator always gives a different result than the code outputs, and I have double-checked that what I typed into the calculator matches the expression in the code.
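To take the GPU out of the picture, here is a minimal host-side sketch of the same expression: once written with the exact types from the kernel, and once evaluated entirely in 64-bit arithmetic the way the calculator treats the numbers. This is just a plain C++ sketch; the exact printed values may depend on the toolchain.

#include <cstdio>

int main() {
    int x = 46340;
    int y = 1;

    // Same expression and same types as in the kernel above:
    // the arithmetic happens in int, and the result is then stored in a long.
    long eq_kernel = ((x + y) * (x + y)) / (x * y);

    // The same expression evaluated entirely in 64-bit arithmetic.
    long long eq_wide = ((long long)(x + y) * (x + y)) / ((long long)x * y);

    printf("kernel-style expression: %ld\n", eq_kernel);  // -46341 for me, matching the GPU output
    printf("64-bit expression:       %lld\n", eq_wide);   // 46342, matching the calculator
    return 0;
}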
I was expecting an output of 46342.
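For reference, working it out by hand with those inputs: (46340 + 1)^2 = 46341^2 = 2,147,488,281, and 2,147,488,281 / (46340 * 1) ≈ 46342.0, so after truncating to an integer I expected 46342.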