
I have a Spring Integration flow which uploads files to an SFTP server asynchronously; the files to be uploaded come from an HTTP endpoint. Initially I faced the same problem as discussed here; I'm glad it got solved.

In the same SO thread I found this comment:

In enterprise environments, you often have files of sizes you cannot afford to buffer into memory like that. Sadly enough, InputStreamResource won't work either. Your best bet, as far as I could tell so far, is to copy contents to an own temp file (e.g. File#createTempFile) which you can clean up at the end of the processing thread.

Currently I'm wrapping the file's InputStream in an InputStreamResource to get rid of the problem, and it's working flawlessly. Why does the commenter say InputStreamResource won't work either? AFAIK an InputStream never stores data in memory.

Does the InputStreamResource's InputStream get closed automatically after the file upload?

When we say large files, what file size are we talking about here? Currently, in my case, 2-5 MB files are uploaded to SFTP.

Do I really need to change my file upload mechanism to something like storing the file in a temp folder?

Code Sample:

@PostMapping("/upload")
public void sampleEndpoint(@NotEmpty @RequestParam MultipartFile file)
    throws IOException {
  Resource resource = new InputStreamResource(file.getInputStream());
  sftpFileService.upload(resource);
}

SftpFileService Async upload method:

@Async
public void upload(Resource resource) {
  try {
    messagingGateway.upload(resource);
  } catch (Exception e) {
    e.printStackTrace();
  }
}

1 Answer


2-5 MB is probably not a size to worry about. The problem could appear when files are 1-2 GB in size, although you may face out-of-memory errors when several concurrent uploads happen against your service.

The InputStreamResource is just a decorator around an InputStream, exposing the Resource API for access to the underlying delegated stream. It is not clear how it can work in an async environment, since the MultipartFile is deleted at the end of the HTTP upload request.

Plus, you don't show any code that would help understand the situation better...
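
For what it's worth, a rough sketch of the temp-file approach from the comment you quoted could look like the following (the Path-based upload signature and the temp-file prefix are illustrative, not taken from your project). The idea is to copy the multipart content to a file you own while the request is still alive, and to clean it up at the end of the async processing thread; it uses java.nio.file.Files/Path and Spring's FileSystemResource:

@PostMapping("/upload")
public void sampleEndpoint(@NotEmpty @RequestParam MultipartFile file)
    throws IOException {
  // Copy the upload to a temp file while the request (and its multipart storage) still exists
  Path tempFile = Files.createTempFile("upload-", ".tmp");
  file.transferTo(tempFile);
  sftpFileService.upload(tempFile);
}

@Async
public void upload(Path tempFile) {
  try {
    // A FileSystemResource can be read safely from another thread, unlike the multipart stream
    messagingGateway.upload(new FileSystemResource(tempFile));
  } catch (Exception e) {
    e.printStackTrace();
  } finally {
    // Clean up the temp copy at the end of the processing thread
    try {
      Files.deleteIfExists(tempFile);
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
}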

Artem Bilan
  • Suppose there are 200 concurrent uploads happening at once; does that mean there will be `fileSize multiplied by 200` memory used? BTW, I have added a code sample to my question. Thanks – abdur rehman Mar 02 '21 at 06:22
  • Doesn't look like your `messagingGateway.upload()` is an async call... So, that's probably why your `InputStreamResource` solution works. Otherwise I believe it is going to fail, because the HTTP layer deletes tmp files after getting control back at the end of the request... – Artem Bilan Mar 03 '21 at 15:29
  • You are right. Thanks, I just realized that my async method was not working, so `InputStreamResource` worked. However, I will apply the tmp file mechanism – abdur rehman Mar 04 '21 at 12:52