I have a Java program that sends a series of GET requests to a web service and stores each response body in a text file.
I have implemented the example code below (trimmed to highlight the relevant part), which appends each response to the file as a new line at the end. The code works, but performance suffers as the file grows.
The total size of the data is almost 4 GB, and each request appends about 500 KB to 1 MB of data on average.
do
{
    // send the GET request and fetch the response body as a string
    // (HTTP.GET is shorthand for the actual client call)
    String resultData = HTTP.GET(uri);

    // open the file in append mode; try-with-resources closes the writer
    // after each write, so nothing is lost if the loop is interrupted
    try (BufferedWriter writer = new BufferedWriter(new FileWriter(path, true)))
    {
        writer.write(resultData);
        writer.newLine();
    }
}
while (hasMoreData()); // hasMoreData() stands in for the real stop condition
These files are created on a daily basis and moved to HDFS for Hadoop consumption and as a real-time archive. Is there a better way to achieve this?
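One direction I have considered is keeping a single BufferedWriter open for the whole run instead of reopening the file for every request, and letting try-with-resources flush and close it at the end. In this sketch, fetchNext() and the output path are placeholders for my actual GET call and daily file:

import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class DailyResponseWriter
{
    public static void main(String[] args) throws IOException
    {
        // open the day's file once in append mode and reuse the writer
        try (BufferedWriter writer = Files.newBufferedWriter(
                Paths.get("/data/responses.txt"), // illustrative path
                StandardCharsets.UTF_8,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND))
        {
            String resultData;
            // fetchNext() stands in for the GET request; null means no more data
            while ((resultData = fetchNext()) != null)
            {
                writer.write(resultData);
                writer.newLine();
            }
        } // writer is flushed and closed here, even if an exception is thrown
    }

    private static String fetchNext()
    {
        // placeholder for the actual HTTP GET; the real version returns the body
        return null;
    }
}

Would opening the writer once like this be the right direction, or is there a better pattern for this use case?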