
I know there is a post about loading a Lucene index into RAM:

Faster search in Lucene - Is there a way to keep the whole index in RAM?

But I really need this for Solr, to improve search speed. Any pointers would be helpful :) Thanks.

– Aditya

1 Answer

I don't think this is a good idea. It's like asking to improve speed in Windows by disabling its swapfile. Solr implements some very efficient caching mechanisms on top of Lucene, and the operating system's file system cache already keeps frequently read index files in RAM.

If you have speed issues with Solr, this is not the solution. Please post another question detailing your problems so we can recommend a proper solution.
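For context, those caches are configured in the `<query>` section of solrconfig.xml. A minimal sketch follows; the sizes are illustrative placeholders, not tuned recommendations:

```xml
<!-- Inside the <query> section of solrconfig.xml -->

<!-- Caches filter (fq) results as document sets, reused across queries -->
<filterCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="128"/>

<!-- Caches ordered result lists for repeated queries and paging -->
<queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="128"/>

<!-- Caches stored fields of fetched documents -->
<documentCache class="solr.LRUCache" size="512" initialSize="512"/>
```

The `autowarmCount` attribute repopulates a cache from the previous searcher after a commit opens a new one; note that the documentCache cannot be autowarmed, because internal Lucene document IDs can change between searchers.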

See also: XY Problem

– Mauricio Scheffer
  • I have a log file with 36 million entries, each with 5 small fields. I want to search them; since traditional DBs failed at indexing and searching them, I turned to Solr and successfully indexed the data. Now I am trying to search it. The total index file size is 5 GB. I observed that if I allocate more than the index file size to the JVM, caching doesn't seem to take place (I say that because previously, with less memory allocated to the JVM, every nth query took 5-6 seconds; I assume the cache hit rate was 0% and the data had to be loaded). – Aditya Apr 17 '11 at 20:34
  • @Aditya: please post all those details in a new question. If Solr isn't caching, either it's misconfigured or the queries are wrong. – Mauricio Scheffer Apr 17 '11 at 21:44