I'm trying to compress the files inside a directory using the Java FileSystem API (zipfs). It works fine when there are only a few files, but it fails when there are more than 100 files.
This is the code I used in my program:
import java.io.File;
import java.net.URI;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.HashMap;
import java.util.Map;

// create the zip file if it doesn't exist yet
Map<String, String> env = new HashMap<>();
env.put("create", "true");

URI uri = URI.create("jar:file://10.0.8.31/Shared/testFile.zip");
File dir = new File("D:\\Shared\\DPXSequence");

try (FileSystem zipfs = FileSystems.newFileSystem(uri, env)) {
    for (File sourceF : dir.listFiles()) {
        Path externalFile = Paths.get(sourceF.getAbsolutePath());
        Path pathInZipfile = zipfs.getPath("/" + sourceF.getName());
        // copy a file into the zip file
        Files.copy(externalFile, pathInZipfile, StandardCopyOption.REPLACE_EXISTING);
    }
} catch (Exception e) {
    System.out.println("Error : " + e.toString());
}
This is the error I'm getting:
Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
Where am I going wrong? I think the Files.copy() call is returning before the data is actually compressed and written to the destination. Is that causing the issue?
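For comparison, would a streaming approach like the sketch below avoid the problem? It writes each entry through one small reusable buffer with java.util.zip.ZipOutputStream instead of going through the zip FileSystem; the UNC path to the share and the class name are my assumptions, not tested code.

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipDirectoryStreaming {
    public static void main(String[] args) throws IOException {
        File dir = new File("D:\\Shared\\DPXSequence");
        // Stream each file straight into the archive so only the
        // 8 KB buffer is held on the heap at any time.
        try (ZipOutputStream zos = new ZipOutputStream(
                new FileOutputStream("\\\\10.0.8.31\\Shared\\testFile.zip"))) {
            byte[] buffer = new byte[8192];
            for (File sourceF : dir.listFiles()) {
                if (!sourceF.isFile()) {
                    continue; // skip subdirectories
                }
                zos.putNextEntry(new ZipEntry(sourceF.getName()));
                try (FileInputStream in = new FileInputStream(sourceF)) {
                    int n;
                    while ((n = in.read(buffer)) > 0) {
                        zos.write(buffer, 0, n);
                    }
                }
                zos.closeEntry();
            }
        }
    }
}

I'd still like to understand why the FileSystem version runs out of heap, though. Alternatively, if the zipfs provider in my JDK supports the "useTempFile" option, would adding env.put("useTempFile", "true") keep the entries off the heap?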