I am writing lots of strings out to a file and noticed that at some point, write operations start to take longer than before. Most of the strings are unique and generated at run-time using a StringBuilder, so at first I suspected that was the issue, but it turns out there are other reasons.
I wrote a quick program to see what's going on:
public static void main(String[] args) {
    long time, t1, t2;
    int n = 10000;
    int threshold = 10;
    try {
        BufferedWriter out = new BufferedWriter(new FileWriter("C:\\temp\\out.txt"));
        for (int i = 0; i < n; i++) {
            t1 = System.currentTimeMillis();
            out.write("test\r\n");
            t2 = System.currentTimeMillis();
            time = t2 - t1;
            if (time > threshold) {
                System.out.println(time);
            }
        }
        out.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I put in a threshold to filter out write operations that take minimal time. I set it to 10 milliseconds.
When n = 10 000, nothing is printed out for me, which means the writes are fast. As I increase n to 100 000, 1 000 000, and 10 000 000, a couple of numbers are printed out. Then at 100 000 000 I start seeing lots of numbers being printed out. Increasing it to 1 000 000 000, a lot of write operations take several tens to hundreds of milliseconds, which greatly reduces throughput.
There are likely many different reasons why this happens, such as my using a spinning disk drive or disk fragmentation. I've tried increasing the buffer size to 1 MB or 10 MB, but it didn't seem to help (in fact, it seemed to make things worse).
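For reference, this is roughly how I increased the buffer size (a minimal sketch; the temp-directory path here stands in for C:\temp\out.txt, and BufferedWriter's second constructor argument sets the internal buffer size in characters, replacing the default of 8192):

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class BigBufferTest {
    public static void main(String[] args) throws IOException {
        String path = System.getProperty("java.io.tmpdir") + "/out.txt";
        // Second constructor argument: buffer size of 1 MB instead of the default 8 KB.
        try (BufferedWriter out = new BufferedWriter(new FileWriter(path), 1 << 20)) {
            for (int i = 0; i < 100_000; i++) {
                out.write("test\r\n"); // 6 bytes per line
            }
        } // try-with-resources flushes and closes the writer
    }
}
```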
Is there anything I can do to avoid this sudden drop in throughput over time?