
I am using FSCrawler to index a dataset of over 7 TB. The indexing starts fine but then stops once the index size reaches about 2.6 GB. I believe this is a memory issue; how do I configure the memory?

My machine has 40 GB of memory, and I have assigned 12 GB to Elasticsearch.
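For reference, a heap of that size is typically assigned in Elasticsearch's config/jvm.options file (values shown here to match the 12 GB mentioned above):

-Xms12g
-Xmx12g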


  • Please edit your post and replace the image with the content of your yaml file. That way people will be able to copy-paste the text and use it for further searching. – Maxim Sagaydachny Dec 09 '19 at 18:15
  • This might help with your issue: https://www.slideshare.net/AndrewClegg1/scaling-elasticsearch-for-multiterabyte-analytics – bilpor Dec 10 '19 at 07:51

1 Answer


You might also need to assign enough memory to FSCrawler itself, using FS_JAVA_OPTS. For example:

FS_JAVA_OPTS="-Xmx4g -Xms4g" bin/fscrawler
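As with the Elasticsearch heap, setting -Xms and -Xmx to the same value avoids heap resizing at runtime. If individual documents are very large, it may also help to lower the bulk request size in the job's _settings.yaml (located under ~/.fscrawler/{job_name}/ in FSCrawler 2.x; setting names per the FSCrawler documentation, values here are only illustrative):

elasticsearch:
  bulk_size: 100
  byte_size: "10mb"
  flush_interval: "5s"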