
I am writing a Java application that retrieves BLOB-type data from a database. I run a query to fetch all rows into a List of Map<String, Object>, one map per row keyed by column name, and iterate over the list when I need the data.

However, I get an OutOfMemoryError after fetching the list of rows more than a couple of times. Do I need to release memory somewhere in my code? My code is as follows:

ByteArrayInputStream binaryStream = null;
OutputStream out = null;
try {   
   List<Map<String, Object>> result =
    jdbcOperations.query(
        sql,
        new Object[] {id},
        new RowMapper(){
          public Object mapRow(ResultSet rs, int i) throws SQLException {
            DefaultLobHandler lobHandler = new DefaultLobHandler();
            Map<String, Object> results = new HashMap<String, Object>();
            String fileName = rs.getString(ORIGINAL_FILE_NAME);
            if (!StringUtils.isBlank(fileName)) {
              results.put(ORIGINAL_FILE_NAME, fileName);
            }
            byte[] blobBytes = lobHandler.getBlobAsBytes(rs, "AttachedFile"); 
            results.put(BLOB, blobBytes);

            int entityID = rs.getInt(ENTITY_ID); 
            results.put(ENTITY_ID, entityID);
            return results;
          }
        }
    );

  int count = 0;
  for (Iterator<Map<String, Object>> iterator = result.iterator(); 
      iterator.hasNext();) 
  {
    count++;
    Map<String, Object> row = iterator.next();
    byte[] attachment = (byte[])row.get(BLOB);
    final int entityID = (Integer)row.get(ENTITY_ID);
    if( attachment != null) {
      final String originalFilename = (String)row.get(ORIGINAL_FILE_NAME);
      String stripFilename;
      if (originalFilename.contains(":\\")) {
        stripFilename = StringUtils.substringAfter(originalFilename, ":\\");
      }
      else {
        stripFilename = originalFilename;
      }
      String filename = pathName + entityID + "\\"+ stripFilename;

      boolean exist = (new File(filename)).exists();

      iterator.remove(); // release the resource

      if (!exist) {
        binaryStream = new ByteArrayInputStream(attachment);
        InputStream extractedStream = null;
        try {
          extractedStream = decompress(binaryStream);
          final byte[] buf = IOUtils.toByteArray(extractedStream);
          out = FileUtils.openOutputStream(new File(filename));
          IOUtils.write(buf, out);
        }
        finally {
          IOUtils.closeQuietly(extractedStream);
        }
      }
      else {
        continue;
      }
    }
  }
}
catch (FileNotFoundException e) {
  e.printStackTrace();
}
catch (IOException e) {
  e.printStackTrace();
}
finally {
  IOUtils.closeQuietly(out);
  IOUtils.closeQuietly(binaryStream);
}
newguy

3 Answers


There are also command-line parameters you can use for tuning memory, for example:

-Xms128m -Xmx1024m -XX:MaxPermSize=256m

Here's a good link on using JConsole to monitor a Java application:

http://java.sun.com/developer/technicalArticles/J2SE/jconsole.html
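A quick way to confirm that flags like -Xmx actually took effect is to print what the running JVM reports. This is a minimal sketch using the standard Runtime API (not part of the original answer):

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        // maxMemory() reflects the -Xmx ceiling; totalMemory() is what the
        // JVM has actually claimed from the OS so far; freeMemory() is the
        // unused portion of totalMemory().
        System.out.println("max  : " + rt.maxMemory() / mb + " MB");
        System.out.println("total: " + rt.totalMemory() / mb + " MB");
        System.out.println("free : " + rt.freeMemory() / mb + " MB");
    }
}
```

Running this with and without -Xmx1024m should show the "max" line change accordingly.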

paulsm4

Your Java Virtual Machine probably isn't using all the memory it could. You can configure it to get more from the OS (see How can I increase the JVM memory?). That would be a quick and easy fix. If you still run out of memory, look at your algorithm -- do you really need all those BLOBs in memory at once?

Jim Ferrans

Consider reorganizing your code so that you don't keep all the BLOBs in memory at once. Instead of putting them all into the result list, write each one to disk as you retrieve it.

The advice about expanding your memory settings is good also.
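The key point is to stream each blob to disk in fixed-size chunks rather than materializing every blob as a byte[] in a list. A minimal sketch of the copy step, assuming the InputStream would come from something like rs.getBinaryStream("AttachedFile") inside mapRow() (the ByteArrayInputStream below just simulates a blob):

```java
import java.io.*;

public class BlobStreamDemo {
    // Copy a stream to a file in fixed-size chunks, so only one 8 KB
    // buffer is ever resident, regardless of the blob's total size.
    static long streamToFile(InputStream in, File target) throws IOException {
        long total = 0;
        try (OutputStream out = new BufferedOutputStream(new FileOutputStream(target))) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Simulated blob; in the real code this would be the BLOB column's
        // binary stream, written out row by row instead of collected.
        byte[] fakeBlob = new byte[100_000];
        File tmp = File.createTempFile("blob", ".bin");
        tmp.deleteOnExit();
        long written = streamToFile(new ByteArrayInputStream(fakeBlob), tmp);
        System.out.println(written + " bytes written");
    }
}
```

With this shape, mapRow() can write the file and return only the small metadata (filename, entity ID), so no blob bytes survive the query call.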

Ed Staub
  • This is also what I thought, because I don't want my application to require JVM setting changes when it runs on another computer. But this way I might need to query the database many times and group the results carefully so as not to miss any rows. – newguy Sep 15 '11 at 01:08
  • I've modified my code and the query so it performs the query based on a range of IDs, and the OutOfMemoryError no longer occurs. – newguy Sep 15 '11 at 02:08
  • Maybe I'm missing something, but... I think you "just" need to replace the "put()" code with a call to a method that saves the blob to disk - the second part of your method as written. Splitting it off into a separate method will make the code easier to understand. Then your mapRow() function doesn't have to return anything significant - you won't use the return value. – Ed Staub Sep 15 '11 at 02:14
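The range-based fix newguy describes in the comments can be sketched generically. The fetch function below is a stand-in for the real query (e.g. the same SQL with "WHERE id BETWEEN ? AND ?" via jdbcOperations); only one window of rows is in memory at a time:

```java
import java.util.*;
import java.util.function.BiFunction;

public class RangePagingDemo {
    // Process rows in ID windows of `batchSize`. `fetch` stands in for
    // the real database query over the given [lo, hi] ID range.
    static int processInRanges(int minId, int maxId, int batchSize,
                               BiFunction<Integer, Integer, List<String>> fetch) {
        int processed = 0;
        for (int lo = minId; lo <= maxId; lo += batchSize) {
            int hi = Math.min(lo + batchSize - 1, maxId);
            for (String row : fetch.apply(lo, hi)) {
                processed++; // write each blob to disk here, then drop it
            }
        }
        return processed;
    }

    public static void main(String[] args) {
        // Fake table with IDs 1..10, one row per ID.
        BiFunction<Integer, Integer, List<String>> fakeFetch = (lo, hi) -> {
            List<String> rows = new ArrayList<>();
            for (int id = lo; id <= hi; id++) rows.add("row-" + id);
            return rows;
        };
        System.out.println(processInRanges(1, 10, 3, fakeFetch)); // prints 10
    }
}
```

Contiguous, non-overlapping windows guarantee no row is missed or processed twice, which addresses the "group the results carefully" concern from the comments.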