I have just downloaded and run Elasticsearch on my Windows 10 PC. I have not changed any settings; everything is default. The software is running and I can access it on port 9200. However, it appears to be consuming about 9 GB of RAM.

I have nothing stored in it, nothing is indexed, it's a default setup.

Why does Elasticsearch need so much memory by default?


EDIT: It's the latest version, 8.1.2.

  • https://stackoverflow.com/questions/52421232/elastic-search-high-memory-consumption it looks like it depends on your JVM settings – Krzysztof Skowronek Apr 19 '22 at 08:02
  • On Windows, Java determines the maximum amount of memory a JVM process gets dynamically, depending on your overall amount of RAM. How much RAM do you have overall? If you want to set a fixed maximum amount, you need to configure the `Xmx` parameter for the Elasticsearch service. – Tomalak Apr 19 '22 at 08:05
  • @KrzysztofSkowronek Thanks for that. But I have other Java applications running on the same machine and they never take up so much RAM. Why is that? – HibernatePro1337 Apr 19 '22 at 08:11
  • a) Not every Java application takes up all the RAM it could theoretically use. Databases such as Elasticsearch, though, will use as much RAM as possible, because they are faster with more memory cache. b) Not every Java application is started without an upper RAM limit set, and the dynamic configuration only applies to the ones that are. You need to inspect their startup parameters (or config files) to find out the configuration. – Tomalak Apr 19 '22 at 08:29
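To see the dynamic limit Tomalak describes, you can ask any JVM for its own view of the maximum heap. A minimal sketch (the `MaxHeap` class name is just for illustration); run it without an `-Xmx` flag and the printed value reflects the JVM's default sizing for your machine, which differs from the explicit limit Elasticsearch's bundled JVM configures:

```java
// Prints this JVM's maximum heap size in MiB.
// Without an explicit -Xmx flag, the JVM picks this value
// based on the machine's physical RAM.
public class MaxHeap {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MiB");
    }
}
```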

1 Answer

The latest Elasticsearch version (8.1.2 in your case) comes with a bundled JDK, and with default settings the Elasticsearch heap is set to 50% of the RAM allocated to the machine. It looks like your machine has ~20 GB of RAM. If you want to change this setting, you can follow the steps given in the official jvm options document.

Note: You should allocate 50% of the machine's RAM to the Elasticsearch heap if you are running it in production, but it shouldn't exceed ~30 GB in any case.
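For reference, the mechanism the official docs describe for ES 8.x is a custom options file under `config/jvm.options.d/` rather than editing `jvm.options` directly. A sketch of such a file (the 4 GB figure is only an example; set it to what fits your machine, and keep `-Xms` equal to `-Xmx`):

```
# config/jvm.options.d/heap.options
# Fix the heap at 4 GB instead of letting Elasticsearch
# auto-size it to 50% of the machine's RAM.
-Xms4g
-Xmx4g
```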

– Amit
  • "You should allocate 50% of machine RAM to Elasticsearch heap if you running it in Production" is not true. You definitely must not exceed 50%, but heaps smaller than 50% may perform better. The smaller the heap the more space there is for the filesystem cache, and the filesystem cache size is often critical for performance. – David Turner Apr 23 '22 at 16:56