We have a web application where users can download a list of data by clicking a link. The click fires a stored proc on a MS SQL Server DB, which fetches rows with 14 columns, all Strings. As we read each row out of the ResultSet we stream it straight down to a CSV file on the client's machine. That way we avoid creating intermediate domain objects (one per returned row) in memory before the streaming starts, and we do not wait until the whole result set has been loaded into memory.

However, for clients who have e.g. 80,000 rows of such data, memory still spikes by 50 MB; then it falls by around 20 MB and stays at that level for quite some time. If I do a Perform GC in jconsole it frees the remaining 30 MB as well. I am not sure what causes that memory to linger. A 50 MB spike is also unacceptable for an application running on 1.2 GB of memory; for bigger clients it shoots up by 400 MB and the application freezes or an OOM happens. Any suggestions on how we can keep the memory footprint flat while streaming?

PLEASE note: I have implemented the same thing in another place, and there it downloads a file of the same size but different data (6 columns) in 5 seconds, with a memory spike of only 5 MB. In that case the stored proc also runs in only 4 seconds in SQL Server Management Studio. But for the one giving the huge spike, the query itself takes 45 seconds or more depending on the data, because it passes everything through a lot of validation. Can that have an adverse effect? I was hoping not, since we fetch in chunks of 1000 via setFetchSize() on the PreparedStatement.
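One theory I want to rule out is that the JDBC driver is ignoring the fetch-size hint and buffering the whole result set on the client before the first rs.next() returns. Assuming we were on the Microsoft SQL Server JDBC driver (the property below comes from that driver; I am not sure jTDS has an equivalent), the buffering mode could be forced in the connection URL. A minimal sketch with made-up host and database names:

// Hypothetical URL - host, port and database are placeholders.
// responseBuffering=adaptive asks the Microsoft driver to stream rows
// from the server as they are consumed, instead of reading the whole
// result set into client memory up front.
String url = "jdbc:sqlserver://dbhost:1433;databaseName=appdb;responseBuffering=adaptive";
Connection connection = DriverManager.getConnection(url, user, password);

If we are already buffering adaptively, then the spike must be coming from somewhere else.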
Here is the snippet of code:
Connection connection = null;
PreparedStatement statement = null;
ResultSet rs = null;
OutputStream outputStream = null;
BufferedWriter bufferedWriter = null;
try
{
    response.setContentType("application/save");
    response.setHeader("Content-Disposition", "attachment; filename="
            + link.getFileName());
    outputStream = response.getOutputStream();
    bufferedWriter = new BufferedWriter(new OutputStreamWriter(outputStream));

    connection = dataSource.getConnection();
    statement = connection.prepareStatement(link.getQuery());
    // Hint to the driver to pull rows from the server in chunks of 1000.
    statement.setFetchSize(1000);
    statement.setInt(1, form.getSelectedClientId());
    rs = statement.executeQuery();
    while (rs.next())
    {
        // Write each row straight to the response; no domain objects are built.
        bufferedWriter.write(getCsvRowString(new String[]
                { rs.getString(1), rs.getString(2), rs.getString(3),
                  rs.getString(4), rs.getString(5), rs.getString(6),
                  rs.getString(7), rs.getString(8), rs.getString(9),
                  rs.getString(10), rs.getString(11), rs.getString(12),
                  rs.getString(13), rs.getString(14), rs.getString(15),
                  rs.getString(16), rs.getString(17), rs.getString(18) }));
    }
} catch (final Exception e)
{
    log.error("Error in downloading extracts", e);
    throw e;
} finally
{
    // Null checks: any of these may still be null if an earlier call
    // in the try block threw.
    if (bufferedWriter != null)
    {
        bufferedWriter.flush();
        bufferedWriter.close();
    }
    if (outputStream != null)
    {
        outputStream.close();
    }
    if (rs != null)
    {
        rs.close();
    }
    if (statement != null)
    {
        statement.close();
    }
    if (connection != null)
    {
        connection.close();
    }
}
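For completeness, this is the direction I am considering for the cleanup, in case a ResultSet or Connection that never gets closed on an error path explains the memory that only a manual GC reclaims. Just a sketch of the same loop using Java 7 try-with-resources (dataSource, link, form and getCsvRowString as above), not something we run yet:

response.setContentType("application/save");
response.setHeader("Content-Disposition", "attachment; filename=" + link.getFileName());

// try-with-resources closes the resources in reverse declaration order,
// even if an exception is thrown mid-stream, so nothing can leak.
try (Connection connection = dataSource.getConnection();
     PreparedStatement statement = connection.prepareStatement(link.getQuery()))
{
    statement.setFetchSize(1000);
    statement.setInt(1, form.getSelectedClientId());
    try (ResultSet rs = statement.executeQuery();
         BufferedWriter bufferedWriter = new BufferedWriter(
                 new OutputStreamWriter(response.getOutputStream())))
    {
        while (rs.next())
        {
            bufferedWriter.write(getCsvRowString(new String[]
                    { rs.getString(1), rs.getString(2), /* ... */ rs.getString(18) }));
        }
    }
}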