
I am working with the Stanford CoreNLP package, which provides a set of jar files and an execution unit. I could compile and run a few test examples.

There is one sample Java example. I compiled it successfully with:

H:\Drive E\Stanford\stanfor-corenlp-full-2013~>javac -cp stanford-corenlp-3.3.0.jar;stanford-corenlp-3.3.0-javadoc.jar;stanford-corenlp-3.3.0-models.jar;stanford-corenlp-3.3.0-sources.jar; StanfordCoreNlpDemo.java

When I ran it:

H:\Drive E\Stanford\stanfor-corenlp-full-2013~>java -cp stanford-corenlp-3.3.0.jar;stanford-corenlp-3.3.0-javadoc.jar;stanford-corenlp-3.3.0-models.jar;stanford-corenlp-3.3.0-sources.jar; StanfordCoreNlpDemo

It gave this exception:

Searching for resource: StanfordCoreNLP.properties
Searching for resource: edu/stanford/nlp/pipeline/StanfordCoreNLP.properties
Adding annotator tokenize
Adding annotator ssplit
Adding annotator pos
Reading POS tagger model from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [8.7 sec].
Adding annotator lemma
Adding annotator ner
Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at java.io.ObjectInputStream$HandleTable.grow(ObjectInputStream.java:344)

How can I allocate more memory on the command line to avoid the above exception and execute the example?

I could run these two commands successfully:

java -cp "*" -mx1g edu.stanford.nlp.sentiment.SentimentPipeline -file input.txt

and

java -cp stanford-corenlp-3.3.0.jar;stanford-corenlp-3.3.0-models.jar;xom.jar;joda-time.jar -Xmx600m edu.stanford.nlp.pipeline.StanfordCoreNLP -annotators tokenize,ssplit,pos,lemma,parse -file input.txt
user2256866
user123

2 Answers


You can add `-Xmx1024m` to your command line, which will give 1 GB of RAM to your application. However, I would rather advise you to use a Java heap profiler, like the one embedded in NetBeans, to find out what the real problem is.
You can also refer to this post, which has a much more complete explanation:
What are Runtime.getRuntime().totalMemory() and freeMemory()?
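The `Runtime` methods mentioned in that post can also be used to confirm how much heap the `-Xmx` flag actually granted. A minimal sketch (the class name `HeapInfo` is just for illustration):

```java
// Print the JVM heap figures discussed in the linked post.
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        // maxMemory(): the ceiling the heap can grow to (set by -Xmx)
        System.out.println("max:   " + rt.maxMemory() / mb + " MB");
        // totalMemory(): heap currently reserved by the JVM
        System.out.println("total: " + rt.totalMemory() / mb + " MB");
        // freeMemory(): unused portion of the reserved heap
        System.out.println("free:  " + rt.freeMemory() / mb + " MB");
    }
}
```

Running it with `java -Xmx1024m HeapInfo` should report a `max` value close to 1024 MB, which is a quick way to check that the flag was actually picked up.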

Kiwy
  • I already used NetBeans and set the memory with `-Xmx500,600,800,1000`, but never succeeded. Sometimes there is no response for 15-20 minutes, or an out-of-heap exception. I googled a lot to somehow get the example executed, but no luck – user123 Dec 04 '13 at 08:54
  • If you do face an `OutOfMemory` exception, it means that the example has a major memory leak somewhere. Try to use the memory profiler integrated with NetBeans to see which objects are using so much RAM: https://profiler.netbeans.org/ – Kiwy Dec 04 '13 at 10:27
  • Give it **real** memory. `java -Xmx4g ...` – Ingo Dec 04 '13 at 10:42

I believe I am too late to answer your question, but this will definitely save somebody else the time that I took to figure out how to execute the StanfordCoreNlpDemo.java file.

H:\Drive E\Stanford\stanfor-corenlp-full-2013~>java -cp stanford-corenlp-3.3.0.jar;stanford-corenlp-3.3.0-javadoc.jar;stanford-corenlp-3.3.0-models.jar;stanford-corenlp-3.3.0-sources.jar; -Xmx1200m StanfordCoreNlpDemo

For the October 2014 version of stanford-corenlp, ';' (semicolon) should be replaced by ':' (colon). Note that a space is needed between the last ';' and the `-Xmx` option in the above java command.
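To illustrate both forms side by side (jar names taken from this thread; the trailing `.` classpath entry, standing in for the directory that holds `StanfordCoreNlpDemo.class`, is an assumption):

```shell
# Windows: classpath entries are separated by ';'
java -cp "stanford-corenlp-3.3.0.jar;stanford-corenlp-3.3.0-models.jar;." -Xmx1200m StanfordCoreNlpDemo

# Linux/macOS: the same command, but entries are separated by ':'
java -cp "stanford-corenlp-3.3.0.jar:stanford-corenlp-3.3.0-models.jar:." -Xmx1200m StanfordCoreNlpDemo
```

Quoting the classpath also avoids the shell treating `;` as a command separator on Unix-like systems.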

user2256866