When I encrypt a large volume of data, the JVM heap fills up and the application throws java.lang.OutOfMemoryError.
I don't want to increase the Java heap size manually in my container.
Here is my Java code:
public String encrypt(String data) {
    try {
        IvParameterSpec iv = new IvParameterSpec(initVector.getBytes("UTF-8"));
        SecretKeySpec skeySpec = new SecretKeySpec(key.getBytes("UTF-8"), "AES");
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5PADDING");
        cipher.init(Cipher.ENCRYPT_MODE, skeySpec, iv);
        byte[] encrypted = cipher.doFinal(data.getBytes());
        return Base64.encodeBase64String(encrypted);
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return null;
}
I was also told to call cipher.update first and then cipher.doFinal, but that won't work for me because I am dealing with a single block of data (one String), not a stream of chunks.
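For context, this is roughly what I understand that cipher.update approach to look like. It is only a sketch I put together; the InputStream/OutputStream parameters, the StreamingEncryptSketch class name, and the 8192-byte read buffer are my own assumptions and are not part of my actual code, which only receives a String:

import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class StreamingEncryptSketch {

    // Hypothetical streaming variant: encrypts whatever "in" provides and
    // writes the ciphertext to "out" without holding everything in memory.
    public static void encrypt(InputStream in, OutputStream out,
                               String key, String initVector) throws Exception {
        IvParameterSpec iv = new IvParameterSpec(initVector.getBytes(StandardCharsets.UTF_8));
        SecretKeySpec skeySpec = new SecretKeySpec(key.getBytes(StandardCharsets.UTF_8), "AES");
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, skeySpec, iv);

        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            byte[] chunk = cipher.update(buffer, 0, read); // encrypt this chunk
            if (chunk != null) {
                out.write(chunk);                          // write ciphertext as it is produced
            }
        }
        out.write(cipher.doFinal());                       // final block with PKCS5 padding
    }
}

My problem is that my encrypt method does not get a stream like this in the first place.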
What can I do so that encrypting large data does not exhaust the heap?
I saw the similar question Java Out of Memory Error during Encryption, but it does not resolve my problem.
I also don't want to specify a maximum number of bytes to process at a time.
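The only other thing I have read about is CipherOutputStream, which, as far as I understand, does the chunking internally so the caller never picks a maximum byte count (the fixed-size buffer below is only a read buffer, not a limit on how much gets processed). Again this is just a sketch under the assumption that the plaintext could come from a stream; the CipherStreamSketch class and its encrypt method are hypothetical, not my real code:

import javax.crypto.Cipher;
import javax.crypto.CipherOutputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class CipherStreamSketch {

    // Copies plaintext from "in" to "out", encrypting on the fly.
    // CipherOutputStream buffers and pads internally, so no explicit
    // "max bytes per call" has to be chosen by the caller.
    public static void encrypt(InputStream in, OutputStream out,
                               String key, String initVector) throws Exception {
        IvParameterSpec iv = new IvParameterSpec(initVector.getBytes(StandardCharsets.UTF_8));
        SecretKeySpec skeySpec = new SecretKeySpec(key.getBytes(StandardCharsets.UTF_8), "AES");
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, skeySpec, iv);

        try (CipherOutputStream cos = new CipherOutputStream(out, cipher)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                cos.write(buffer, 0, read); // ciphertext is written to "out" as we go
            }
        } // closing the CipherOutputStream writes the final padded block
    }
}

Is something along these lines the right direction, or is there a way to keep my existing String-based method without running out of heap?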