I'm getting a java.lang.OutOfMemoryError when I try to download large files (>200MB) from the web application that I'm working on.
The download flow is the following:
Main method:
public byte[] getFileBytes(@RequestBody ZeusRequestVO<String> request) {
    return documentService.downloadFileByChunks(request).toByteArray();
}
Download logic:
public ByteArrayOutputStream downloadFileByChunks(String blobName) {
    long file_size = 0;
    long chunkSize = 10 * 1024 * 1024; // 10 MB per range
    CloudBlockBlob blob = connectAndgetCloudBlockBlob(blobName);
    ByteArrayOutputStream baos = new ByteArrayOutputStream(); // every downloaded range is accumulated here, in memory
    try {
        if (blob.exists()) {
            blob.downloadAttributes();
            file_size = blob.getProperties().getLength();
            for (long i = 0; i < file_size; i += chunkSize) {
                blob.downloadRange(i, file_size, baos);
            }
        }
    } catch (StorageException e) {
        throw new GenericException(e, BusinesErrorEnum.AZURE_BLOB_STORAGE_EXCEPTION);
    }
    return baos;
}
I already added -Xms and -Xmx settings to the app, and that works as long as the files don't exceed 200MB; in fact, initially the web app couldn't download files larger than 30MB until the -Xms and -Xmx configuration was added.
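For reference, the JVM heap options look roughly like this (the sizes below are placeholders, not my exact values):

-Xms512m -Xmx2048m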
I saw a possible solution here, but I'm not able to update or add libraries beyond the ones we already have (company policy).
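Since the project already uses Spring MVC, I was wondering whether something along these lines would avoid holding the whole file in the heap without adding any dependency. This is only a rough, untested sketch: StreamingResponseBody ships with spring-webmvc (I'm assuming my version already includes it), streamFileByChunks is a hypothetical replacement for my current method, I've left out my ZeusRequestVO wrapping to keep it short, and each range is capped at the remaining bytes instead of passing the full file size:

// Controller: return a StreamingResponseBody (org.springframework.web.servlet.mvc.method.annotation)
// so Spring writes directly to the servlet response instead of building a byte[] in memory.
public StreamingResponseBody getFileStream(String blobName) {
    return outputStream -> documentService.streamFileByChunks(blobName, outputStream);
}

// Service: write each downloaded range straight to the response stream.
public void streamFileByChunks(String blobName, OutputStream out) {
    long chunkSize = 10 * 1024 * 1024; // same 10 MB chunk size as before
    try {
        CloudBlockBlob blob = connectAndgetCloudBlockBlob(blobName);
        if (blob.exists()) {
            blob.downloadAttributes();
            long fileSize = blob.getProperties().getLength();
            for (long offset = 0; offset < fileSize; offset += chunkSize) {
                long length = Math.min(chunkSize, fileSize - offset); // only the remaining bytes of this range
                blob.downloadRange(offset, length, out);
            }
        }
    } catch (StorageException e) {
        throw new GenericException(e, BusinesErrorEnum.AZURE_BLOB_STORAGE_EXCEPTION);
    }
}

The idea is that only one range at a time would sit in memory instead of the whole file.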
Any advice?