I started out running some tests to see the speed difference between writing to a file and printing to the console, and how much of a difference there was between an SSD and an HDD. My program just prints the numbers 0-10,000,000.
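For reference, the benchmark is essentially this (a simplified sketch; the class name and "out.txt" are just placeholders, and my real code may differ in details like buffering):

    import java.io.File;
    import java.io.FileNotFoundException;
    import java.io.PrintStream;

    public class PrintBenchmark {
        static final int N = 10_000_000;

        // Prints 0..N to the given stream and returns the elapsed time in seconds.
        static double time(PrintStream out) {
            long start = System.nanoTime();
            for (int i = 0; i <= N; i++) {
                out.println(i);
            }
            out.flush();
            return (System.nanoTime() - start) / 1e9;
        }

        public static void main(String[] args) throws FileNotFoundException {
            double consoleTime = time(System.out);
            try (PrintStream file = new PrintStream(new File("out.txt"))) {
                double fileTime = time(file);
                // Report on stderr so the timings don't get mixed into the console output.
                System.err.println("Console: " + consoleTime);
                System.err.println("File: " + fileTime);
            }
        }
    }

Running that with ten million numbers gives: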
Console: 6.089
File: 4.269
I also ran the test up to a hundred million and consistently saw the same ratio of times. I also tried changing the order of the two tests and saw no change in speed.
Here's where it gets weird. I changed both printlns to .println(i*i+42/7*9-89*2%400/2); and after doing this I got:
Console: 8.586
File: 4.475
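(Side note on what that expression actually prints: if I'm reading Java's precedence rules right, it reduces to i*i - 35, since *, / and % bind tighter than + and - and evaluate left to right. A quick check, with a made-up class name:)

    public class PrecedenceCheck {
        public static void main(String[] args) {
            // i*i + ((42/7)*9) - (((89*2)%400)/2)  =  i*i + 54 - 89  =  i*i - 35
            int i = 1000;
            System.out.println(i*i + 42/7*9 - 89*2%400/2); // 999965
            System.out.println(i*i - 35);                  // 999965
        }
    }

So each line grows from at most 8 digits to mostly 9-11 characters (with an int counter, i*i overflows above i = 46340, so the values wrap around but stay long).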
The console time increased significantly, but the file time did not. As a final oddity, I changed it to .println((i*i+42/7*9-89*2)%400/2), and in this case I actually saw a speedup in console output:
Console: 4.352
File: 4.66
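(For what it's worth, the parenthesized version groups as ((i*i + 54 - 178) % 400) / 2, so every printed value lands in the range -199 to 199 and each line is only a few characters long. Another quick check, class name again made up:)

    public class ParenCheck {
        public static void main(String[] args) {
            int i = 1000;
            // ((1000000 + 54 - 178) % 400) / 2 = (999876 % 400) / 2 = 276 / 2
            System.out.println((i*i + 42/7*9 - 89*2) % 400 / 2); // 138
        }
    }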
Can anyone explain these oddities? I can't seem to find any reason for the drastic speed changes. I'm thinking perhaps it's just a change in the number of bits that have to be written, but I cannot explain why it only affects the console's speed.
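To put a rough number on that hunch, here's a quick side check (not part of the timed runs; class name made up) that just counts how many characters each variant writes for 0 through 10,000,000, including one newline per line:

    public class OutputSizeCheck {
        public static void main(String[] args) {
            long plain = 0, unparen = 0, paren = 0;
            for (int i = 0; i <= 10_000_000; i++) {
                plain   += Integer.toString(i).length() + 1;
                unparen += Integer.toString(i*i + 42/7*9 - 89*2%400/2).length() + 1;
                paren   += Integer.toString((i*i + 42/7*9 - 89*2) % 400 / 2).length() + 1;
            }
            System.out.println("plain:           " + plain);
            System.out.println("unparenthesized: " + unparen);
            System.out.println("parenthesized:   " + paren);
        }
    }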
Any help or answers are very much appreciated! This problem has been bothering me for a while, so I thought I would ask the experts!