I have a Spring Integration flow that uploads files to an SFTP server asynchronously; the files to be uploaded come in through an HTTP endpoint. Initially I faced the same problem as discussed here, and I'm glad it got solved.
In the same SO thread I found this comment:
In enterprise environments, you often have files of sizes you cannot afford to buffer into memory like that. Sadly enough, InputStreamResource won't work either. Your best bet, as far as I could tell so far, is to copy contents to an own temp file (e.g. File#createTempFile) which you can clean up at the end of the processing thread.
Currently I'm connecting the file's InputStream to an InputStreamResource
to get around the problem, and it's working flawlessly. A few questions:

1. Why does the commenter say InputStreamResource won't work either? AFAIK, an InputStream doesn't buffer the whole file in memory.
2. Does the InputStreamResource's InputStream get closed automatically after the file upload?
3. When we say "large file", what file size are we talking about? Currently, in my case, files of 2-5 MB are uploaded to the SFTP server.
4. Do I really need to change my file upload mechanism to something like storing the file in a temp folder first?
Code Sample:
@PostMapping("/upload")
public void sampleEndpoint(@NotEmpty @RequestParam MultipartFile file )
throws IOException {
Resource resource = new InputStreamResource(file.getInputStream());
sftpFileService.upload(resource);
}
SftpFileService Async upload method:
@Async
public void upload(Resource resource) {
    try {
        messagingGateway.upload(resource);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
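For reference, this is my rough understanding of the temp-file approach the commenter describes, adapted to my own code above (the method and field names are just mine, and where the cleanup happens is my assumption, not a confirmed implementation):

@PostMapping("/upload")
public void sampleEndpoint(@NotEmpty @RequestParam MultipartFile file)
        throws IOException {
    // Copy the multipart payload to a temp file instead of keeping an open stream
    File tempFile = File.createTempFile("upload-", ".tmp");
    file.transferTo(tempFile);
    sftpFileService.upload(tempFile);
}

@Async
public void upload(File tempFile) {
    try {
        messagingGateway.upload(new FileSystemResource(tempFile));
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        // Clean up the temp file at the end of the processing thread,
        // as the commenter suggests
        tempFile.delete();
    }
}

Would something along these lines really be necessary for files in the 2-5 MB range?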