I'm solving an algorithm test problem based on the Collatz conjecture.
In short:
1-1. If the number is even, divide it by 2.
1-2. If it is odd, multiply it by 3 and add 1.
2. Repeat step 1 (1-1 or 1-2) until the number becomes 1.
For example, 6 becomes 1 after 8 tries (6 → 3 → 10 → 5 → 16 → 8 → 4 → 2 → 1).
In the test, the process should finish within 500 tries, and the function returns the number of tries it took. If it doesn't reach 1 within 500 tries, it returns -1.
Here's my code.
using System;

public class Program {
    public int Main(int num) {
        int answer = -1;
        int maxTry = 500;
        int count = 0;

        if (num == 1)
            return count;

        for (count = 0; count < maxTry; count++)
        {
            // 1-1
            if (num % 2 == 0)
            {
                num /= 2;
            }
            // 1-2
            else
            {
                num = num * 3 + 1;
            }

            if (num == 1)
            {
                answer = count + 1;
                break;
            }
        }

        Console.Write(answer);
        return answer;
    }
}
It was working quite well until it met '626331'! According to the problem's explanation, 626331 can't reach 1 within 500 tries, so the expected answer is -1. But my code returned 488, which would mean it reaches 1 on the 488th try. When I printed the process try by try, it looked like it was working correctly.
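The try-by-try printing was essentially the loop above with one WriteLine added. Here is a standalone reconstruction of that debug version (the class and method names here are my own):

    using System;

    // Standalone debug version of the same logic, printing every step.
    public class CollatzDebug {
        public static void Run(int num) {
            int maxTry = 500;
            for (int count = 0; count < maxTry; count++) {
                if (num % 2 == 0)
                    num /= 2;          // 1-1: even
                else
                    num = num * 3 + 1; // 1-2: odd

                Console.WriteLine("try " + (count + 1) + ": " + num);

                if (num == 1)
                    break;
            }
        }

        public static void Main() {
            Run(626331);
        }
    }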
After all my attempts, I found out that the even/odd check was the problem.
I changed this
if (num % 2 == 0)
    ...
else
    ...

into

if (num % 2 == 0)
    ...
else if (num % 2 == 1)
    ...
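The only values the two versions treat differently are ones where num % 2 is neither 0 nor 1. In C# that can actually happen: the remainder operator keeps the sign of the left operand, so a negative odd number yields -1. A quick standalone check (my own snippet, not part of the test):

    using System;

    public class RemainderCheck {
        public static void Main() {
            // In C#, % takes the sign of the left-hand operand.
            Console.WriteLine(5 % 2);    // prints 1
            Console.WriteLine(-5 % 2);   // prints -1 -> matches neither branch in the else-if version
        }
    }

So with else if (num % 2 == 1), a negative odd num would match neither branch and stay unchanged until the loop runs out, returning -1. But I don't see how num could ever become negative from a positive starting value.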
Now every case passes! But I have no clue what is actually going on here.
It was an online coding test, and the compiler was the Mono C# Compiler 5.14.0.177.
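In case anyone wants to reproduce this with the same toolchain, this is the local driver I use (the file and class names are my own; on the test site the harness calls Main(num) directly):

    using System;

    // Local repro driver. Build and run with Mono:
    //   mcs -out:repro.exe Program.cs Repro.cs && mono repro.exe
    public class Repro {
        public static void Main() {
            var p = new Program();
            int result = p.Main(626331); // expected -1 per the problem; the plain-else version gives 488
            Console.WriteLine();
            Console.WriteLine("returned: " + result);
        }
    }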