
I am trying to track down a memory leak in the Java code shown below.

  • This code throws an OutOfMemoryError: Java heap space while running with a max heap size of 2 GB, after handling only a few file updates.
  • The compressed size of the files on S3 is no larger than a few hundred kB.
  • The input String (the "amendment" parameter) is no larger than 1 MB.
  • A heap dump shows a byte[] hundreds of MB in size - in one case it was taking up over 50% of the heap. (Tooling has not pointed to where this array is being created, however.) The JVM options I run with are shown, roughly, just after this list.
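For reference, these are roughly the JVM options the service runs with (the jar name and dump path below are illustrative):

java -Xmx2g -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps -jar file-service.jar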

The code uses an AmazonS3 client instance to read and write objects in a remote S3 bucket, Jackson's ObjectMapper to parse Strings into JSON and serialise JSON back to bytes, and the Deflater and Inflater classes from the java.util.zip package for compression/decompression.
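For context, saveFile is called once per incoming file update, roughly like this (the constructor signature, variable names and key format here are illustrative, not the real caller):

FileManager fileManager = new FileManager(bucketName, s3Client, new ObjectMapper());
// "amendment" is a JSON string of at most ~1 MB; updateTimeStamp becomes the field name in the stored JSON
fileManager.saveFile("customer-123.json", amendment, Instant.now().toString());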

Is there an obvious memory leak in this code? Is there an obvious bug which could lead to an ever-growing or huge byte array on the heap? I've been really scratching my head over this one for a while! Thanks.

class FileManager {
    // Constructor omitted
    private static final int BYTE_SIZE = 1024;
    private final String bucketName;
    private final AmazonS3 client;
    private final ObjectMapper objectMapper;

    public void saveFile(String key, String amendment, String updateTimeStamp) throws Exception {
        S3Object object = null;
        try {
            JsonNode amendmentJson = objectMapper.readTree(amendment);
            ObjectNode targetJson;
            if (client.doesObjectExist(bucketName, key)) {
                object = client.getObject(bucketName, key);
                InputStream objectStream = object.getObjectContent();
                targetJson = (ObjectNode) decompress(objectStream);
                objectStream.close();
                object.close();
            } else {
                targetJson = objectMapper.createObjectNode();
            }
            targetJson.set(updateTimeStamp, amendmentJson);
            byte[] contentBytes = compress(objectMapper.writeValueAsBytes(targetJson));
            try (ByteArrayInputStream contentStream = new ByteArrayInputStream(contentBytes)) {
                client.putObject(bucketName, key, contentStream);
            }
        } catch (Exception ex) {
          // Logs and rethrows ...
        }
    }

    public byte[] compress(byte[] data) throws IOException {
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream(data.length);
        Deflater deflater = new Deflater();
        byte[] buffer = new byte[BYTE_SIZE];
        deflater.setInput(data);
        deflater.finish();
        while (!deflater.finished()) {
            int count = deflater.deflate(buffer);
            outputStream.write(buffer, 0, count);
        }
        outputStream.close();
        deflater.end();
        return outputStream.toByteArray();
    }

    public JsonNode decompress(InputStream objectReader) throws IOException {
        byte[] data = IOUtils.toByteArray(objectReader);
        byte[] buffer = new byte[BYTE_SIZE];
        Inflater inflater = new Inflater();
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream(data.length);
        String outputString;
        inflater.setInput(data);
        try {
            while (!inflater.finished()) {
                int count = inflater.inflate(buffer);
                outputStream.write(buffer, 0, count);
            }
            outputString = new String(outputStream.toByteArray());
        } catch (DataFormatException e) {
            log.info("Content not zipped");
            outputString = new String(data);
        }
        outputStream.close();
        inflater.end();
        StringReader reader = new StringReader(outputString);
        JsonNode jsonNode = objectMapper.readTree(reader);
        reader.close();
        return jsonNode;
    }

}
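
In case it helps with reproducing, here is a standalone sketch of the same Deflater/Inflater round trip in isolation (no S3, no Jackson). The sample payload is made up to roughly match the ~1 MB input described above:

import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class ZipRoundTrip {

    private static final int BYTE_SIZE = 1024;

    public static void main(String[] args) throws DataFormatException {
        // Build a ~1 MB JSON-ish payload, roughly matching the "amendment" size.
        StringBuilder sb = new StringBuilder("{\"entries\":[");
        while (sb.length() < 1_000_000) {
            sb.append("{\"id\":1,\"value\":\"some repeated content\"},");
        }
        sb.setLength(sb.length() - 1); // drop trailing comma
        sb.append("]}");
        byte[] original = sb.toString().getBytes(StandardCharsets.UTF_8);

        byte[] compressed = compress(original);
        byte[] restored = decompress(compressed);

        System.out.println("original:   " + original.length + " bytes");
        System.out.println("compressed: " + compressed.length + " bytes");
        System.out.println("restored:   " + restored.length + " bytes");
    }

    // Same shape as FileManager.compress, minus the ObjectMapper step.
    static byte[] compress(byte[] data) {
        Deflater deflater = new Deflater();
        deflater.setInput(data);
        deflater.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream(data.length);
        byte[] buffer = new byte[BYTE_SIZE];
        while (!deflater.finished()) {
            int count = deflater.deflate(buffer);
            out.write(buffer, 0, count);
        }
        deflater.end();
        return out.toByteArray();
    }

    // Same shape as FileManager.decompress, minus the JSON parsing and the
    // "content not zipped" fallback.
    static byte[] decompress(byte[] data) throws DataFormatException {
        Inflater inflater = new Inflater();
        inflater.setInput(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream(data.length);
        byte[] buffer = new byte[BYTE_SIZE];
        while (!inflater.finished()) {
            int count = inflater.inflate(buffer);
            out.write(buffer, 0, count);
        }
        inflater.end();
        return out.toByteArray();
    }
}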