While performing a wildcard search on Elasticsearch documents with large input text (more than 1,000 characters), I'm getting the exception below for a "starts with" search.
My "starts with" pattern looks like "myinputtext..*", where myinputtext is more than 1,000 characters long.
Detailed exception:
Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [http://localhost:9200], URI [/index_133/_search?pre_filter_shard_size=128&typed_keys=true&max_concurrent_shard_requests=5&ignore_unavailable=false&expand_wildcards=open&allow_no_indices=true&ignore_throttled=true&search_type=dfs_query_then_fetch&batched_reduce_size=512&ccs_minimize_roundtrips=true], status line [HTTP/1.1 400 Bad Request]
{"error":{"root_cause":[{"type":"query_shard_exception","reason":"failed to create query: input automaton is too large: 1001"}]},"status":400}
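As far as I can tell, the failure depends only on the length of the literal text before the *: the 1001 in the message tracks my pattern length (literal characters + 1), which suggests the pattern is compiled into an automaton with a hard cap around 1,000 states. I believe even a stripped-down, non-nested wildcard query like this one (the angle-bracket placeholder stands in for my real 1,000+ character input) would hit the same limit:
{
  "query": {
    "wildcard": {
      "inputTextDetails.searchInput.input.keyword": {
        "wildcard": "<literal input of more than 1000 characters>*"
      }
    }
  }
}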
My request sample is:
{
  "query": {
    "bool": {
      "filter": [
        {
          "term": {
            "indexId": {
              "value": 133,
              "boost": 1
            }
          }
        },
        {
          "nested": {
            "query": {
              "bool": {
                "must": [
                  {
                    "wildcard": {
                      "inputTextDetails.searchInput.input.keyword": {
                        "wildcard": "myinputtext sample*",
                        "boost": 1
                      }
                    }
                  }
                ],
                "adjust_pure_negative": true,
                "boost": 1
              }
            },
            "path": "inputTextDetails",
            "ignore_unmapped": false,
            "score_mode": "none",
            "boost": 1
          }
        }
      ],
      "adjust_pure_negative": true,
      "boost": 1
    }
  }
}
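Since my pattern is always <literal text>*, one direction I'm considering is rewriting the wildcard as a prefix query, which expresses the same starts-with intent (dropped into the same nested/bool structure as above). I haven't verified whether prefix queries are subject to the same automaton limit for 1,000+ character values:
{
  "prefix": {
    "inputTextDetails.searchInput.input.keyword": {
      "value": "myinputtext sample"
    }
  }
}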
And my mappings.json is:
{
  "dynamic_templates": [
    {
      "with_keyword_custom_normalizer": {
        "match_mapping_type": "string",
        "mapping": {
          "type": "text",
          "analyzer": "simple",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 4000,
              "normalizer": "lower_case_normalizer"
            }
          }
        },
        "path_match": "inputTextDetails.searchInput.*"
      }
    }
  ]
}
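I also came across the dedicated wildcard field type (available since Elasticsearch 7.9), which as I understand it is intended for wildcard and regexp queries on long or high-cardinality values. I haven't verified whether it avoids this particular limit, but as a sketch, my template could expose it as an extra multi-field alongside the keyword:
{
  "dynamic_templates": [
    {
      "with_keyword_custom_normalizer": {
        "match_mapping_type": "string",
        "mapping": {
          "type": "text",
          "analyzer": "simple",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 4000,
              "normalizer": "lower_case_normalizer"
            },
            "wildcard": {
              "type": "wildcard",
              "ignore_above": 4000
            }
          }
        },
        "path_match": "inputTextDetails.searchInput.*"
      }
    }
  ]
}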
Please share your thoughts on searching for large inputs on large data fields. Thanks for your help.
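PS: the simplest fallback I can think of is truncating the literal part of the pattern to at most 1,000 characters on the client side before building the query (the error fires at 1,001). Two keyword values sharing a 1,000-character prefix should be rare in my data, so a truncated starts-with query like this (placeholder again standing in for the truncated input) may be selective enough:
{
  "wildcard": {
    "inputTextDetails.searchInput.input.keyword": {
      "wildcard": "<first 1000 characters of the input>*"
    }
  }
}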