
We start indexing data from a DB using SolrJ: we retrieve about 1 million records from the DB and then do some processing on them in Java. But we face a problem with memory. When we start the indexing process, the memory grows until it reaches 7 GB. The problem is that the indexing process finishes, but the memory does not decrease; it stays allocated at 7 GB.

user1025523
  • Do you mean the memory that the JVM holds, or did you have a look into the JVM - with tools like VisualVM (found in your JAVA_HOME/bin) - and see that the memory is still held by Solr? – cheffe May 22 '13 at 07:21
  • The memory is held by the server that contains both the Solr war and the war of the application that indexes the data. Also, I did not use VisualVM. – user1025523 May 22 '13 at 07:40

2 Answers


You should be able to find the answer to your problem in the links below:

http://wiki.apache.org/solr/SolrPerformanceFactors

http://wiki.apache.org/solr/SolrCaching

http://www.lucidimagination.com/content/scaling-lucene-and-solr

Juned Ahsan

As you say the JVM holds the memory, I think you have the problem elaborated in JVM sending back memory to OS.

To tackle this you will need to do as Michael Borgwardt wrote there:

If you want the JVM to return memory to the OS more eagerly, you can use the tuning parameters of the Oracle JVM, specifically -XX:MaxHeapFreeRatio and -XX:MinHeapFreeRatio
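For the Oracle/HotSpot JVM, these ratios go on the command line of whatever servlet container hosts the Solr war. A minimal sketch, assuming a Jetty start.jar launch and a 4 GB heap cap (both assumptions; whether the heap actually shrinks also depends on the garbage collector in use):

    # hypothetical launch of the container hosting the Solr war;
    # lower ratios make HotSpot hand free heap back to the OS sooner
    java -Xmx4g -XX:MinHeapFreeRatio=20 -XX:MaxHeapFreeRatio=40 -jar start.jar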

cheffe
  • But will this option work with the JRockit JDK or not? Also, I read that this is related to the OS as well; we use Linux. – user1025523 May 23 '13 at 07:39
  • No, these command-line arguments will not do for JRockit. But as far as I read (http://docs.oracle.com/cd/E13188_01/jrockit/docs142/intro/newftrs.html#1000276), JRockit should free heap out of the box. So I would advise you to check whether you have a memory leak in your SolrJ routine. – cheffe May 23 '13 at 10:03
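Picking up on that last suggestion: a common source of such a leak in a SolrJ indexing routine is keeping all one million SolrInputDocuments referenced until a single add/commit at the end. A minimal sketch of a bounded-memory alternative, assuming the SolrJ 4.x HttpSolrServer API; the JDBC URL, query, field names and batch size are all hypothetical placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class BatchedIndexer {

        private static final int BATCH_SIZE = 1000;

        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; the real DB and Solr URLs were not posted.
            Connection db = DriverManager.getConnection("jdbc:mysql://localhost/mydb", "user", "pass");
            HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr/collection1");

            Statement stmt = db.createStatement();
            // Hint to the JDBC driver to fetch rows incrementally instead of
            // materialising the whole result set (behaviour is driver-specific).
            stmt.setFetchSize(BATCH_SIZE);
            ResultSet rs = stmt.executeQuery("SELECT id, title FROM records");

            List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>(BATCH_SIZE);
            while (rs.next()) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", rs.getString("id"));
                doc.addField("title", rs.getString("title"));
                batch.add(doc);

                if (batch.size() >= BATCH_SIZE) {
                    solr.add(batch);  // send this batch to Solr
                    batch.clear();    // drop the references so the documents can be collected
                }
            }
            if (!batch.isEmpty()) {
                solr.add(batch);
            }
            solr.commit();

            rs.close();
            stmt.close();
            db.close();
            solr.shutdown();
        }
    }

With this pattern, the heap only ever holds one batch of documents at a time, so the process footprint should stay far below the 7 GB described in the question regardless of the JVM's heap-shrinking behaviour.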