I need to return a rather large file (around 670 MB) from a web request. Most of the time this works fine, but after the application has been running for a while the following error is thrown:
java.lang.OutOfMemoryError: Direct buffer memory
at java.nio.Bits.reserveMemory(Bits.java:694) ~[na:1.8.0_162]
at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123) ~[na:1.8.0_162]
at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:311) ~[na:1.8.0_162]
at sun.nio.ch.Util.getTemporaryDirectBuffer(Util.java:241) ~[na:1.8.0_162]
at sun.nio.ch.IOUtil.read(IOUtil.java:195) ~[na:1.8.0_162]
at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:159) ~[na:1.8.0_162]
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65) ~[na:1.8.0_162]
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:109) ~[na:1.8.0_162]
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103) ~[na:1.8.0_162]
at java.nio.file.Files.read(Files.java:3105) ~[na:1.8.0_162]
at java.nio.file.Files.readAllBytes(Files.java:3158) ~[na:1.8.0_162]
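Reduced to its essentials, the read path looks like this (a simplified sketch; the class name and file path are placeholders, the actual handler code differs):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FileEndpoint {
    // Simplified sketch: the whole ~670 MB file is loaded into a single
    // byte[] before being written to the response.
    byte[] loadFile() throws IOException {
        Path file = Paths.get("/data/export/archive.bin"); // placeholder path
        // Files.readAllBytes reads through a FileChannel, which allocates a
        // temporary direct buffer internally (see the stack trace above).
        return Files.readAllBytes(file);
    }
}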
I have set the heap size to 4096 MB, which I think should be large enough to handle files of this size. Furthermore, when the error occurred I took a heap dump with jmap to analyze the current state. I found two rather large byte[] arrays, which should correspond to the file I want to return. However, the heap was only around 1.6 GB in size, nowhere near the configured 4 GB maximum.
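The heap setting and the jmap invocation look roughly like this (the jar name, dump file name, and pid are placeholders):

java -Xmx4096m -jar application.jar
jmap -dump:format=b,file=heap.hprof <pid>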
Following an answer (https://stackoverflow.com/a/39984276/5126654) to a similar question, I tried triggering a manual GC before returning the file (see the snippet below). The problem still occurred, but only sporadically: it showed up after some time, yet when I ran the same request again the garbage collection seemed to have taken care of whatever had caused it. That is not sufficient, though, since the error can apparently still occur. Is there some other way to avoid this memory problem?
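For completeness, the workaround amounts to this (a minimal sketch; readWithGc is a made-up name, and the real handler looks different):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class GcWorkaround {
    // Suggest a collection before the large read so the JVM can reclaim
    // unreferenced direct buffers first, as proposed in the linked answer.
    static byte[] readWithGc(Path file) throws IOException {
        System.gc();
        return Files.readAllBytes(file);
    }
}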