
I have tried starting the process with 1, 2, 3, and 4 GB of memory but still get the same error. Any tips? The GC log shows that it's running the GC even though it has enough memory, but this error probably has something to do with the memory-mapped files used by NIO. Has anybody seen anything like this before? And if so, how did you solve it?

$ java -d64 -server -Xmx15g -Xms15g -XX:+UseConcMarkSweepGC  -verbose:gc -XX:MaxPermSize=512m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/frank/heap.dmp  -jar lukeall-3.5.0.jar 
    [Full GC 207648K->28758K(15660544K), 0.1105290 secs]
    [Full GC 61479K->15416K(15660544K), 0.0654310 secs]
    [Full GC 69950K->15418K(15660544K), 0.0717170 secs]
    [Full GC 69952K->15418K(15660544K), 0.0661720 secs]
    java.io.IOException: Map failed
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:849)
        at org.apache.lucene.store.MMapDirectory$MMapIndexInput.<init>(MMapDirectory.java:265)
        at org.apache.lucene.store.MMapDirectory.openInput(MMapDirectory.java:216)
        at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:89)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:115)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:93)
        at org.apache.lucene.index.DirectoryReader.<init>(DirectoryReader.java:113)
        at org.apache.lucene.index.DirectoryReader$1.doBody(DirectoryReader.java:83)
        at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:754)
        at org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:75)
        at org.apache.lucene.index.IndexReader.open(IndexReader.java:462)
        at org.apache.lucene.index.IndexReader.open(IndexReader.java:377)
        at org.getopt.luke.Luke.openIndex(Unknown Source)
        at org.getopt.luke.Luke.openOk(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at thinlet.Thinlet.invokeImpl(Unknown Source)
        at thinlet.Thinlet.invoke(Unknown Source)
        at java.awt.Component.dispatchEventImpl(Component.java:4861)
    Caused by: java.lang.OutOfMemoryError: Map failed
        at sun.nio.ch.FileChannelImpl.map0(Native Method)
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:846)
        ... 48 more
user3111525

1 Answer

ulimit -v unlimited

Solved the problem!
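For reference, a minimal sketch of checking and lifting the limit before launching the JVM (assumes a Linux shell; the jar name and JVM flags are taken from the question above):

```shell
# Show the current virtual-memory limit for this shell
# (printed in kilobytes, or "unlimited")
ulimit -v

# Lift the limit for this shell and every process it starts.
# FileChannelImpl.map allocates address space outside the Java heap,
# so this is what mmap needs -- not a bigger -Xmx.
ulimit -v unlimited

# Then launch Luke from the same shell so it inherits the new limit:
# java -d64 -server -Xmx15g -Xms15g -jar lukeall-3.5.0.jar
```

To make the change permanent you would typically edit `/etc/security/limits.conf`; as noted in the comments, such limits only take effect for new login sessions.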

user3111525
    +1 It worked for me too. Note to myself: **1** `-v The maximum amount of virtual memory available to the shell and, on some systems, to its children`. **2** `sun.nio.ch.FileChannelImpl.map` consumes memory _outside_ the java virtual machine (thus changing `-Xmx` does not solve the problem). The process is running out of memory not because the jvm cannot create objects, but because the jvm doesn't have more memory available from the system. **3** `limits only get effective after a session is restarted by init` – Alberto Jun 19 '12 at 15:38
  • Worked for me too, you saved my day. @Alberto thanks for the good explanation of why this error cannot be solved by staring at -Xmx values. – Harald Jul 03 '14 at 20:01
  • 1
    I had a similar problem, but ulimit -v unlimited was not the solution for me. I had to alter the max_map_count, see http://stackoverflow.com/questions/11683850/how-much-memory-could-vm-use-in-linux – Ben Ziegler Sep 10 '14 at 00:12
  • @BenZiegler I have a similar issue - `ulimit -v unlimited` does not work for me either. What exactly did you change about the max_map_count value? – laughing_man Jul 28 '15 at 20:29
  • I set my max_map_count to 10000000 – Ben Ziegler Jul 29 '15 at 21:30
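As the last comments note, the per-process mapping count can also be the bottleneck rather than the virtual-memory limit. A hedged sketch of inspecting and raising `vm.max_map_count` on Linux (the value 10000000 is the one Ben Ziegler reports; raising it requires root):

```shell
# Each FileChannelImpl.map call creates one memory mapping, and Linux
# caps the number of mappings per process with vm.max_map_count
# (commonly 65530 by default). Lucene's MMapDirectory maps large
# indexes in many chunks, so a big index can exhaust the cap.
cat /proc/sys/vm/max_map_count

# Raise the cap on the running kernel (root required):
# sudo sysctl -w vm.max_map_count=10000000

# To persist across reboots, add the line
#   vm.max_map_count=10000000
# to /etc/sysctl.conf.
```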