I've set up a test Amazon Elasticsearch Service domain, which runs Elasticsearch and Kibana 5.1.

I'm able to insert a test entry via curl:

curl -XPOST "https://mytestservicedomain.amazonaws.com/testindex/testtype" -d "{\"foo\":\"bar\"}"

And verify it was inserted via Kibana's Dev Tools:

Request:

GET _search
{
   "query": {
       "match_all": {}
   }
}

Response:

{
    "_index": "testindex",
    "_type": "testtype",
    "_id": "AVoQD4Kyv413fK4nN1sg",
    "_score": 1,
    "_source": {
      "foo": "bar"
    }
}

But when I go to Discover in Kibana's menu, I'm not able to get any results. All I get is a couple of errors:

  • Saved "field" parameter is now invalid. Please select a new field.
  • Discover: "field" is a required parameter


I've found a couple of posts (post 1, post 2) on Elastic's forums that seem to suggest there are compatibility issues between Kibana and Elasticsearch, but I just wanted to see if anybody else was running into this.
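
In case it's useful for reproducing this, the saved index pattern itself can also be inspected over the REST API. This assumes Kibana 5.x's default setup, where index patterns are stored as index-pattern documents in the .kibana index:

curl "https://mytestservicedomain.amazonaws.com/.kibana/index-pattern/_search?pretty"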

rtorres

4 Answers


In my case, I upgraded and migrated data from ELK 2.x to ELK 5.4. Everything was fine, except for "Discover: "field" is a required parameter".

Then I deleted the index pattern in Management/Kibana and re-added it, but it still didn't work.

Finally, I found that I had to delete the old ".kibana" index in ES, so I just ran:

curl -XDELETE myesdomain.com:9200/.kibana

or use Dev Tools: DELETE /.kibana

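A quick way to check that only .kibana was removed and the data indices are still intact (same placeholder domain as above):

curl "myesdomain.com:9200/_cat/indices?v"

Kibana should recreate an empty .kibana index the next time it loads, so the index pattern has to be added again under Management afterwards.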


I have banged my head on the desk because of this issue. I don't know exactly what the cause was, but I can tell you what I did to solve it. I deleted the index pattern that was generating the error and recreated it with "Index contains time-based events" checked, gave it the same name/regex as before (the same as the one I had deleted), set "Time-field name" to a particular field, and then refreshed Kibana.

In my case, I had computer events stored in Elasticsearch and each document had an "insert_date" field, which is the one I chose as the "Time-field name".

I don't know how much this will help you, but it worked for me, at least for the majority of index patterns. I still have a couple of index patterns that generate the above error despite deleting and recreating them, so my solution doesn't work in all cases, and I'm curious to find out what the problem is with those.
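
If you want to verify the time field before recreating the pattern, checking the mapping first can help (the host and index name below are placeholders; insert_date is just the field from my setup):

curl "localhost:9200/yourindex/_mapping?pretty"

The field you pick as "Time-field name" should show up in the mapping with "type": "date"; Kibana only offers date-mapped fields in that dropdown.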


I was stuck with this issue for the past week. I was confused and kept going through GitHub issue pages, and then I found the solution by accident. When we add an index to Kibana through the Management settings, we need to point "Time-field name" to a particular field. This is the field you need to choose carefully: select a field that is mandatory in your documents, i.e. one that exists in every document. Once that is mapped correctly, the error goes away and you can see your index's documents. Happy coding.
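
If you're not sure which field exists in every document, one way to check is to count the documents that are missing the candidate field (the host, index, and field names below are placeholders):

curl "localhost:9200/yourindex/_count?pretty" -d '{"query":{"bool":{"must_not":{"exists":{"field":"your_time_field"}}}}}'

If the count is anything other than 0, pick a different field for "Time-field name".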


Check your current index mapping and find the field that breaks everything. Then close that index (one or more) and refresh the index pattern in Kibana; the corrupted field is gone and the pattern will work again. You can reopen the index afterwards, but do not refresh the index pattern while the index with the corrupted field is open.
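
If it helps, the close/open steps can also be done over the REST API (host and index name are placeholders):

curl -XPOST "localhost:9200/broken_index/_close"
curl -XPOST "localhost:9200/broken_index/_open"

Refresh the index pattern in Kibana between the two calls, while the index is still closed.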