I would like to upload large files (>1 GB) to my Amazon AWS S3 bucket, so I am using the TransferManager provided by AWS. I also specify the file length, as described in the post "Java Heap Space is insufficient to upload files on AWS S3", but I am still facing a Java heap space error. Does anyone have an idea what I am doing wrong in my code, or what the problem could be in general? Thank you in advance!

public String uploadMultiPartFilePublicRead(MultipartFile multipartFile) throws IOException {
    String fileUrl = "";
    String fileName= "";
    File file = null;
    try {
        file = convertMultiPartToFile(multipartFile);
        fileName = generateFileName(multipartFile);
        fileUrl = endpointUrl + "/" + bucketName + "/" + fileName;
        // uploadFileTos3bucketPublicRead(fileName, file);
    } catch (Exception e) {
        e.printStackTrace();
    }

    if (file == null || !file.exists()) return "File does not exist";

    int maxUploadThreads = 5;

    TransferManager tm = TransferManagerBuilder
            .standard()
            .withS3Client(s3client)
            .withMultipartUploadThreshold((long) (5 * 1024 * 1024))
            .withExecutorFactory(() -> Executors.newFixedThreadPool(maxUploadThreads))
            .build();

    ObjectMetadata metadata = new ObjectMetadata();
    metadata.setContentLength(file.length());

    ProgressListener progressListener =
            progressEvent -> System.out.println("Transferred bytes: " + progressEvent.getBytesTransferred());
      
    PutObjectRequest request = new PutObjectRequest(this.bucketName, fileName, file)
            .withCannedAcl(CannedAccessControlList.PublicRead)
            .withMetadata(metadata);

    request.setGeneralProgressListener(progressListener);

    Upload upload = tm.upload(request);
    try {
        upload.waitForCompletion();
        System.out.println("Upload complete.");
    } catch (AmazonClientException e) {
        System.out.println("Error occurred while uploading file: AmazonClientException");
        e.printStackTrace();
    } catch (InterruptedException i) {
        System.out.println("Error occurred while uploading file: InterruptedException");
        i.printStackTrace();
    }

    try {
        file.delete();
    } catch (Exception e) {
        System.out.println("Error occurred while deleting file.");
        e.printStackTrace();
    }

    return fileUrl;
}

private File convertMultiPartToFile(MultipartFile file) throws IOException {
    File convFile = new File(file.getOriginalFilename());
    FileOutputStream fos = new FileOutputStream(convFile);
    fos.write(file.getBytes());
    fos.close();
    return convFile;
}
Carsten
  • What max heap size (the `-Xmx` option) are you using? Does increasing it help? – jarmod Jul 02 '20 at 10:04
  • I have the default settings there. I could increase it, you are right, but I have limited resources, so I would like to find another way. I heard of the method to convert the file in chunks? – Carsten Jul 02 '20 at 10:34
  • By chunks, do you mean multipart? TransferManager supports that by default. – jarmod Jul 02 '20 at 15:27
  • Show the actual exception, _with stack trace._ I'm betting that the actual problem is in your `convertMultiPartToFile()` method, which attempts to read the entire file into memory. – Parsifal Jul 02 '20 at 17:31
  • Yes, you are right, the `fos.write(file.getBytes());` was the mistake. I changed to an InputStream with the transferTo method. Problem solved :) – Carsten Jul 02 '20 at 18:22
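For reference, a minimal sketch of the streaming conversion Carsten describes in the last comment (assuming Spring's `MultipartFile` and Java 9+ for `InputStream.transferTo`; the file-name handling is kept from the question):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.springframework.web.multipart.MultipartFile;

private File convertMultiPartToFile(MultipartFile file) throws IOException {
    File convFile = new File(file.getOriginalFilename());
    // Stream the upload to disk in fixed-size chunks instead of calling
    // file.getBytes(), which materializes the entire file on the heap
    // and causes the java.lang.OutOfMemoryError for >1 GB uploads.
    try (InputStream in = file.getInputStream();
         OutputStream out = new FileOutputStream(convFile)) {
        in.transferTo(out); // Java 9+; copies via an internal buffer
    }
    return convFile;
}

This keeps memory usage constant regardless of file size, so TransferManager's multipart upload can then read the temp file from disk.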

0 Answers