
This is not a duplicate question; I have seen this. I want to run a Java program and I get this error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at edu.stanford.nlp.ie.crf.CRFLogConditionalObjectiveFunction.empty2D(CRFLogConditionalObjectiveFunction.java:892)
    at edu.stanford.nlp.ie.crf.CRFLogConditionalObjectiveFunction.<init>(CRFLogConditionalObjectiveFunction.java:134)
    at edu.stanford.nlp.ie.crf.CRFLogConditionalObjectiveFunction.<init>(CRFLogConditionalObjectiveFunction.java:117)
    at edu.stanford.nlp.ie.crf.CRFClassifier.getObjectiveFunction(CRFClassifier.java:1792)
    at edu.stanford.nlp.ie.crf.CRFClassifier.trainWeights(CRFClassifier.java:1798)
    at edu.stanford.nlp.ie.crf.CRFClassifier.train(CRFClassifier.java:1713)
    at edu.stanford.nlp.ie.AbstractSequenceClassifier.train(AbstractSequenceClassifier.java:763)
    at edu.stanford.nlp.ie.AbstractSequenceClassifier.train(AbstractSequenceClassifier.java:751)
    at edu.stanford.nlp.ie.crf.CRFClassifier.main(CRFClassifier.java:2917)

According to that, I tried this:

java -Xms2000m -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop 

but the error is not fixed and I see it again! When I set a value higher than 2000m, my OS crashed, or I got this output:

...
// Stanford NER log output elided
...

Time to convert docs to data/labels: 8.8 seconds
Killed

How can I fix it?

Edit:

And for this:

[stanford-ner]$ java -Xms1G -Xmx50G -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop

I get this error:

[1000][2000][3000][4000][5000][6000]OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00007f04c7c00000, 1225785344, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 1225785344 bytes for committing reserved memory.
# An error report file with more information is saved as:
# /stanford-ner/hs_err_pid1536.log
  • Is this *really* not a duplicate? I mean, the answer in that question and the comments to it state perfectly clearly: "Try increasing the heap size, maybe it just needs a bit more memory. If not, you likely have some kind of memory leak or inefficient operations that you need to get rid of". I can see the first part done in your question, but the second one is lacking. – Ordous Jul 09 '14 at 11:47
  • If -Xmx2048m did not work for you, then try increasing it to -Xmx5G or -Xmx10G or -Xmx30G; else change your program to not hold on to so much memory in the first place. – Chris K Jul 09 '14 at 11:47
  • possible duplicate of [What is an OutOfMemoryError and how do I debug and fix it](http://stackoverflow.com/questions/24510188/what-is-an-outofmemoryerror-and-how-do-i-debug-and-fix-it) – Raedwald Jul 09 '14 at 11:51
  • @ChrisK, when I use any value above 2000m I see the "Killed" message. – Vahid Kharazi Jul 09 '14 at 11:51
  • This is not my program; it's for [stanford-nlp](http://www-nlp.stanford.edu/) NER. – Vahid Kharazi Jul 09 '14 at 11:52
  • You get the "Killed" message when there's not even enough storage available to run Java's error-handling logic. This could be because you exceeded an OS process-size limit or some such, or it could simply be because the app is multithreaded and running seriously amok with its allocations. – Hot Licks Jul 09 '14 at 11:54
  • @28, you can only make that value as large as the OS will allow; typically limited by how much physical memory + swap space you have. – Chris K Jul 09 '14 at 12:01
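
As the comments note, a bare "Killed" with no Java stack trace usually means the operating system (on Linux, the kernel's OOM killer) terminated the process because physical memory plus swap was exhausted, so no -Xmx value larger than what the machine can actually back will help. A quick way to check this is sketched below (assuming a Linux host with the standard procps tools):

# Did the kernel's OOM killer terminate the java process?
dmesg | grep -i -E 'killed process|out of memory'

# How much physical memory and swap is actually free?
free -h

# Are there per-process limits (e.g. max virtual memory) in the way?
ulimit -a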

2 Answers


Instead of using the -Xms option,

java -Xms2000m -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop

try -Xmx, as below:

java -Xmx2000m -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop
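
Note that -Xms only sets the initial heap size; it is -Xmx that raises the maximum the heap may grow to, and the maximum is what an "OutOfMemoryError: Java heap space" is about. The two flags can also be combined; a sketch (the 4G value is illustrative, choose one your machine's RAM can actually back):

java -Xms1G -Xmx4G -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop fa.prop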

Reference: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space


Looking at the software's purpose, it is likely to be very memory-consuming, so it is reasonable to assume that 1GB of heap just isn't sufficient and you will have to increase your heap size further.

The messages you get when you try imply that you are using

  • a 32-bit OS or
  • a 32-bit VM,

either of which might limit you to a maximum heap size of about 1.5GB (at least on Windows).

So make sure you use a 64-bit VM on a 64-bit OS, and then try again to increase the heap size.
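
You can verify both points from the shell; a quick sketch (exact version strings vary by JDK vendor):

# A 64-bit JVM identifies itself as a "64-Bit Server VM" in its version banner
java -version

# On Linux, x86_64 here means a 64-bit OS
uname -m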
