
I am trying to search names in Elasticsearch.

Consider a name like kanal-kannan.

Normally we search a name with a wildcard such as *na, so I tried to search like this:

"/index/party_details/_search?size=200&from=0&q=(first_name_v:kanal-*)"

This returns zero records.
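A likely cause can be sketched with a short simulation (this is a rough stand-in for Elasticsearch's default analyzer, not Elasticsearch itself): the hyphenated name is split into two tokens at index time, so no single indexed term starts with kanal-.

```python
import re

def standard_like_tokenize(text):
    """Rough approximation of the default analyzer:
    lowercase, then split on any non-alphanumeric character."""
    return [t for t in re.split(r"[^0-9a-zA-Z]+", text.lower()) if t]

tokens = standard_like_tokenize("kanal-kannan")
# The index holds the terms "kanal" and "kannan" -- neither contains
# the hyphen, so the wildcard term "kanal-*" matches nothing.
print(tokens)                                       # ['kanal', 'kannan']
print(any(t.startswith("kanal-") for t in tokens))  # False
```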

– Arun
  • I think I had the same problem and found a solution for my case: http://stackoverflow.com/questions/30917043/elasticsearch-searching-with-hyphens/30919206#30919206 – Roeland Van Heddegem Jun 18 '15 at 15:20

3 Answers


Unless the hyphen character has been dealt with specifically by the analyzer, the two words in your example, kanal and kannan, will be indexed separately, because any non-alphanumeric character is treated by default as a word delimiter.

Have a look at the documentation for Word Delimiter Token Filter and specifically at the type_table parameter.

Here's an example I used to ensure that an email field was correctly indexed:

ft.custom_delimiter = {
    "type": "word_delimiter",
    "split_on_numerics": false,
    // keep these characters inside tokens instead of splitting on them
    "type_table": ["@ => ALPHANUM", ". => ALPHANUM", "- => ALPHA", "_ => ALPHANUM"]
};
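For context, a filter like that has to be wired into an analyzer and applied to the field in the mapping. Below is a minimal sketch of what the full index settings might look like, written as a Python dict for readability; the analyzer name "name_analyzer" and the filter order are assumptions on my part, not part of the original answer.

```python
# Hypothetical index settings: the custom word_delimiter filter from the
# answer, placed in a custom analyzer and applied to the name field.
index_settings = {
    "settings": {
        "analysis": {
            "filter": {
                "custom_delimiter": {
                    "type": "word_delimiter",
                    "split_on_numerics": False,
                    "type_table": ["@ => ALPHANUM", ". => ALPHANUM",
                                   "- => ALPHA", "_ => ALPHANUM"],
                }
            },
            "analyzer": {
                "name_analyzer": {
                    "type": "custom",
                    "tokenizer": "whitespace",
                    # word_delimiter runs before lowercase so the
                    # type_table sees the original characters
                    "filter": ["custom_delimiter", "lowercase"],
                }
            },
        }
    },
    "mappings": {
        "party_details": {
            "properties": {
                "first_name_v": {"type": "string",
                                 "analyzer": "name_analyzer"}
            }
        }
    },
}
```

With "- => ALPHA" in the type_table, kanal-kannan stays a single token, so a wildcard like kanal-* can match it.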
– l4rd
  • Thank you very much for your response, but I am still facing that problem. Please suggest any other idea. – Arun Mar 21 '14 at 04:20

- is a special character that needs to be escaped (\-) to be searched literally.

If you use the q parameter (that is, the query_string query), the rules of the Lucene Queryparser Syntax apply.

Depending on your analyzer chain, you might not have any - characters in your index at all; in that case, replacing them with a space in your query would work too.
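The escaping can be sketched generically. The character set below is the Lucene query-syntax special characters; a helper like this is not part of any Elasticsearch client, it is just an illustration.

```python
# Characters with special meaning in the Lucene query-string syntax
# (& and | are technically special only as && and ||, but escaping
# single occurrences is harmless).
LUCENE_SPECIAL = set('+-&|!(){}[]^"~*?:\\/')

def escape_query_string(text):
    """Backslash-escape Lucene special characters for a literal search."""
    return "".join("\\" + c if c in LUCENE_SPECIAL else c for c in text)

print(escape_query_string("kanal-kannan"))  # kanal\-kannan
```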

– knutwalker

@l4rd's answer should work properly (I have the same setup). Another option is to map the field with the keyword analyzer to prevent tokenizing altogether. Note that the keyword tokenizer won't lowercase anything, so in that case use a custom analyzer that combines the keyword tokenizer with a lowercase filter.
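That combination might look like the following sketch (the analyzer name "exact_name" is made up for illustration):

```python
# Hypothetical custom analyzer: the keyword tokenizer keeps the whole
# field value as one token, and the lowercase filter restores
# case-insensitive matching.
keyword_lowercase_analyzer = {
    "analysis": {
        "analyzer": {
            "exact_name": {
                "type": "custom",
                "tokenizer": "keyword",
                "filter": ["lowercase"],
            }
        }
    }
}
```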

– Victor Suzdalev