
We have a Java scheduled task that runs queries in a Java program and sends the results to a Jersey-based RESTful web service. While this works correctly in most cases, one case fails with `java.lang.OutOfMemoryError: Java heap space`.

In the following program the data is streamed in chunks, but for one of the queries the data is not actually flushed more than once. Because of this, all the chunks accumulate in memory, causing the out-of-memory error.

We have already seen the questions Java - Upload OutputStream as HTTP File Upload, HttpURLConnection timeout question, and Handling large records in a Java EE application.

Many of them suggest using HttpClient, but we want to avoid it if possible.

        for (int i = 0; i < arr.length(); i++) {
            try {
                JSONObject obj = arr.getJSONObject(i);
                String query = obj.getString("query"); // one query from the list of queries
                int size = 500;
                int count = 0;
                URL url = new URL("http://urltomyserver.com");
                java.security.Security.addProvider(new com.sun.net.ssl.internal.ssl.Provider());
                System.setProperty("java.protocol.handler.pkgs", "com.sun.net.ssl.internal.www.protocol");
                httpcon = (HttpURLConnection) url.openConnection();
                httpcon.setChunkedStreamingMode(1024 * 1024); // request chunked transfer so the body is not buffered
                httpcon.setAllowUserInteraction(false);
                httpcon.setDoOutput(true);
                httpcon.setRequestProperty("Content-type", "application/x-www-form-urlencoded");
                httpcon.setRequestMethod("POST");
                OutputStream rawOutStream = httpcon.getOutputStream();
                BufferedWriter w = new BufferedWriter(new OutputStreamWriter(rawOutStream));
                Map[] results; // declared outside the loop so the while condition can see it
                do {
                    Query q = new Query(query + " limit " + count + ", " + size);
                    count += size;
                    results = q.runQuery();
                    JSONArray resultArr = new JSONArray(results);
                    w.write(resultArr.toString(1));
                    w.flush(); // flushes the writer, yet the chunks still pile up in memory
                } while (results != null && results.length > 0);
                w.close();
                InputStream inputStream = httpcon.getInputStream(); // response is only requested after all data is written
            } catch (Exception ex) {
                EcwLog.AppendExceptionToLog(ex);
            }
        }
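The do-while above pages through the result set by appending a LIMIT clause whose offset grows by `size` each iteration. Just to make the paging arithmetic explicit, here is that clause-building step isolated in a small helper (the class and method names are hypothetical, not part of the original program):

```java
// Hypothetical helper that builds the paged query strings used by the loop above.
public class PagedQuery {
    // Returns the query for one page, e.g. "select * from t limit 500, 500".
    static String pageQuery(String baseQuery, int offset, int pageSize) {
        return baseQuery + " limit " + offset + ", " + pageSize;
    }

    public static void main(String[] args) {
        String base = "select * from results";
        int size = 500;
        // First two pages, mirroring count += size in the loop above.
        System.out.println(pageQuery(base, 0, size));    // select * from results limit 0, 500
        System.out.println(pageQuery(base, size, size)); // select * from results limit 500, 500
    }
}
```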

The server-side receiving function is defined as follows:

    public String saveResultsMP(@QueryParam("apuId") final String apuId,
                                @QueryParam("tableName") String tableName,
                                @QueryParam("queryId") String query_id,
                                InputStream uploadedInputStream) {
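Since the resource method receives the request body as an `InputStream`, the server side can keep memory bounded by draining it in fixed-size chunks instead of reading the whole body into one string. A minimal sketch of that chunk-wise consumption, with a hypothetical class name and the Jersey annotations omitted:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical sketch: drain an uploaded InputStream in fixed-size chunks so the
// whole request body never has to fit in memory at once.
public class StreamingReceiver {
    static long copyInChunks(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[64 * 1024]; // 64 KiB read buffer
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n); // process/persist each chunk as it arrives
            total += n;
        }
        return total;
    }
}
```

Inside `saveResultsMP`, `uploadedInputStream` would be passed to such a method, with `out` being a file or database writer rather than another memory buffer.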
Rishabh
  • How often does your loop run? Does this also happen if you declare `q`, `results` and `resultArr` outside your loop? For performance reasons you should do this anyway. – André Stannek Jan 04 '13 at 11:22
  • Well the for loop runs 4 to 5 times successfully. After that it keeps looping in do-while because one of the queries has over 100,000 rows but it breaks before it can call InputStream inputStream = httpcon.getInputStream(); – Rishabh Jan 04 '13 at 12:05
  • That's what I thought. I'm not familiar with the class `Query`. Maybe it has to be closed? Also, as I said, please try to declare your variables outside the do-while loop. I'm not sure if your objects are dereferenced correctly while the loop is running. As a last option you can always [increase the memory size](http://stackoverflow.com/questions/2294268/how-can-i-increase-the-jvm-memory) for the JVM. – André Stannek Jan 04 '13 at 15:31
  • 1M is a huge chunk size. I would reduce that to 64k or so: there's no real benefit in making it any bigger. – user207421 Jan 04 '13 at 18:43

0 Answers