I have an input stream that is potentially 20-30 MB. I'm trying to upload it in chunks to S3 as a multipart file upload.
I have the content length and the input stream available. How can I do this efficiently while keeping memory usage in mind?
I saw someone do something like this, but I'm not sure I fully understand it:
int contentLength = inputStreamMetadata.getContentLength();
int partSize = 512 * 1024; // Set part size to 512 KB
int filePosition = 0;
ByteArrayInputStream bais = inputStreamMetadata.getInputStream();
List<PartETag> partETags = new ArrayList<>();
byte[] chunkedFileBytes = new byte[partSize];
for (int i = 1; filePosition < contentLength; i++) {
    // Because the last part can be smaller than the part size, adjust it as needed.
    partSize = Math.min(partSize, (contentLength - filePosition));
    // Read the next chunk into the start of the reusable buffer (offset 0, not filePosition).
    filePosition += bais.read(chunkedFileBytes, 0, partSize);
    // Create the request to upload a part.
    UploadPartRequest uploadRequest = new UploadPartRequest()
            .withBucketName(bucketName)
            .withUploadId(uploadId)
            .withKey(fileName)
            .withPartNumber(i)
            .withInputStream(new ByteArrayInputStream(chunkedFileBytes, 0, partSize))
            .withPartSize(partSize);

    UploadPartResult uploadResult = client.uploadPart(uploadRequest);
    partETags.add(uploadResult.getPartETag());
}
Specifically this piece: .withInputStream(new ByteArrayInputStream(chunkedFileBytes, 0, partSize))
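If I'm reading it right, that line wraps only the current chunk of the reusable buffer (bytes 0 through partSize) in a fresh ByteArrayInputStream, so each uploadPart call sends exactly one part's worth of data and memory stays at roughly one part size.

To check my understanding of the overall flow, here's a minimal sketch of what I think it would look like against a plain InputStream. The names s3Client, bucketName, key, and uploadStream are placeholders I made up, and I've used 5 MB parts since S3 rejects parts smaller than 5 MB except for the last one:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.*;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

public static void uploadStream(AmazonS3 s3Client, String bucketName, String key,
                                InputStream in, long contentLength) throws IOException {
    // Start the multipart upload and keep the upload id for the part requests.
    InitiateMultipartUploadResult init =
            s3Client.initiateMultipartUpload(new InitiateMultipartUploadRequest(bucketName, key));
    String uploadId = init.getUploadId();

    int partSize = 5 * 1024 * 1024;      // 5 MB, the S3 minimum for every part except the last
    byte[] buffer = new byte[partSize];  // one reusable buffer, so memory stays at ~partSize
    List<PartETag> partETags = new ArrayList<>();

    long filePosition = 0;
    for (int partNumber = 1; filePosition < contentLength; partNumber++) {
        // Fill the buffer; the last part may end up smaller than partSize.
        int bytesRead = 0;
        while (bytesRead < buffer.length) {
            int n = in.read(buffer, bytesRead, buffer.length - bytesRead);
            if (n == -1) break;
            bytesRead += n;
        }
        if (bytesRead <= 0) break;

        // Wrap only the bytes actually read, so the SDK uploads exactly one part's worth.
        UploadPartRequest uploadRequest = new UploadPartRequest()
                .withBucketName(bucketName)
                .withKey(key)
                .withUploadId(uploadId)
                .withPartNumber(partNumber)
                .withInputStream(new ByteArrayInputStream(buffer, 0, bytesRead))
                .withPartSize(bytesRead);
        partETags.add(s3Client.uploadPart(uploadRequest).getPartETag());

        filePosition += bytesRead;
    }

    // Tell S3 to assemble the uploaded parts into the final object.
    s3Client.completeMultipartUpload(
            new CompleteMultipartUploadRequest(bucketName, key, uploadId, partETags));
}

Is that the right way to think about it, or is there a more memory-efficient approach?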