
When importing items into my Rails app I keep getting the above error being raised by SearchKick on behalf of Elasticsearch.

I'm running Elasticsearch in a Docker container and start my app with `docker-compose up`. I've tried running the command recommended above, but I just get "No such file or directory" back. I do have port 9200 exposed to the outside, but nothing seems to help. Any ideas?

rctneil
  • Are you getting disk watermark related errors/warnings as well? – Nishant Jan 04 '19 at 02:55
  • @NishantSaini Not that I am aware of. SearchKick is only returning the error I posted. – rctneil Jan 04 '19 at 09:34
  • @NishantSaini I'm desperate to get this fixed. I've seen a fix online that says to run `curl -XPUT -H "Content-Type: application/json" > http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'`. I have tried this by running it locally, in my app container, and in my Elasticsearch container, but none of them work. They all just return "http://localhost:9200/_all/_settings: No such file or directory". Any ideas? – rctneil Jan 04 '19 at 22:22
  • 1
    Remove the `>` symbol from the command. That is causing the error. Use the command as `curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'` – Nishant Jan 05 '19 at 02:12
  • I'd also suggest looking at the logs of the Elastic node. The above command will fix the error, but the exact cause of why the index is in read-only mode should be found. – Nishant Jan 05 '19 at 02:15
  • @NishantSaini Ok, I've run that new command and get "curl: (52) Empty reply from server". When running `docker-compose up` I can see this in the output: "elasticsearch_1 | [2019-01-05T11:06:36,027][WARN ][o.e.c.r.a.DiskThresholdMonitor] [y3m9dza] flood stage disk watermark [95%] exceeded on [y3m9dza-TfS2SrCOUg8sBg][y3m9dza][/usr/share/elasticsearch/data/nodes/0] free: 1.2gb[2.1%], all indices on this node will be marked read-only". I've just tried importing data into my Rails app but it still hits that problem. Any other ideas? – rctneil Jan 05 '19 at 11:11
  • @NishantSaini I did run the command purely from my command line, is that correct? Should I be running it with "docker-compose run"? If so, then inside which of the containers? – rctneil Jan 05 '19 at 11:13
  • OK, so as I suspected, this is a disk issue. – Nishant Jan 05 '19 at 11:38
  • Make sure your Elastic container has enough disk available; at least 15% should be free. – Nishant Jan 05 '19 at 11:40
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/186224/discussion-between-rctneil-and-nishant-saini). – rctneil Jan 05 '19 at 11:57
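
As the comments conclude, the root cause is the flood-stage disk watermark: the log line above shows the data node at 95%+ usage, at which point Elasticsearch marks all indices read-only. A minimal shell sketch to check how close a volume is to that threshold (the default path `/` is an assumption; substitute the mount backing `/usr/share/elasticsearch/data` in your setup):

```shell
# Check disk usage against the default flood-stage watermark (95% used).
# DATA_PATH is an assumption; point it at the volume that backs the
# Elasticsearch data directory.
DATA_PATH=${DATA_PATH:-/}
USED=$(df -P "$DATA_PATH" | awk 'NR==2 { gsub(/%/, "", $5); print $5 }')
echo "disk used on $DATA_PATH: ${USED}%"
if [ "$USED" -ge 95 ]; then
    echo "flood-stage watermark exceeded: Elasticsearch will mark indices read-only"
fi
```

If usage is at or above 95%, free disk space first (for Docker setups, `docker system prune` often helps); otherwise Elasticsearch will simply re-apply the read-only block after you clear it.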

2 Answers


Indeed, running `curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'` as suggested by @Nishant Saini resolves the very similar issue I just ran into.

I had hit the disk watermark limits on my machine.
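
For local development, the watermark check itself can also be relaxed in `elasticsearch.yml` (a sketch; `cluster.routing.allocation.disk.threshold_enabled` is a standard Elasticsearch setting, but disabling it is only advisable on a dev machine, never in production):

```yaml
# elasticsearch.yml (development only): disable the disk allocation decider
# so a nearly full laptop disk doesn't flip indices to read-only.
cluster.routing.allocation.disk.threshold_enabled: false
```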

kghbln
  • I tried this but I got `curl: (7) Failed to connect to localhost port 9200: Connection refused` and `curl: (3) [globbing] unmatched close brace/bracket in column 5` – ZINE Mahmoud May 15 '21 at 17:21

Use the following command on Linux:

    curl -s -H 'Content-Type: application/json' -XPUT 'http://localhost:9200/_all/_settings?pretty' -d '{
        "index": {
            "blocks": { "read_only_allow_delete": "false" }
        }
    }'

The same command in Kibana's Dev Tools format:

PUT _all/_settings
{
    "index": {
        "blocks": { "read_only_allow_delete": "false" }
    }
}
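
To confirm the block has been cleared, you can read the settings back in the same Dev Tools console (`filter_path` is a standard Elasticsearch response-filtering parameter; no `blocks` entry in the response means the read-only flag is gone):

```
GET _all/_settings?filter_path=**.blocks
```

Note that passing `null` (as in the command from the comments above) removes the setting entirely, while `"false"` stores an explicit false; both re-enable writes.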
hamid bayat