
I am getting the following exception while using the bulk API in Elasticsearch from Java:

Caused by: java.lang.IllegalArgumentException:
Document contains at least one immense term in field="msg_properties" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[7b 4a 4d 53 43 6f 72 72 65 6c 61 74 69 6f 6e 49 44 3d 6e 75 6c 6c 2c 20 4a 4d 53 4d 65 73]...'

I searched for the above exception; it led to the following Stack Overflow link, and I tried to update the field property from index: not_analyzed to index: no.

But the index is not updating, and I am getting the same exception again.

Can anyone explain how to solve this and how to update the property?

It would be helpful if anyone could provide an example.

Thanks in advance.


2 Answers


You cannot change the mapping of existing fields. So if you want to change an existing mapping, first make sure that the change is actually possible. The following link can help you determine whether it is, and if it is not, it has advice on re-indexing the existing documents.

https://www.elastic.co/blog/changing-mapping-with-zero-downtime
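As a minimal sketch of that approach: create a replacement index whose mapping stores the field without indexing it, then re-index the documents into it. The index name `myindex_v2`, type name `mytype`, and host are assumptions for illustration; the `"index": "no"` string-field option is the 1.x-era mapping syntax the question refers to.

```shell
# Hypothetical index/type names; mappings can only be set when the index
# (or field) is first created, which is why a new index is needed.
curl -XPUT 'http://localhost:9200/myindex_v2' -d '{
  "mappings": {
    "mytype": {
      "properties": {
        "msg_properties": { "type": "string", "index": "no" }
      }
    }
  }
}'
```

Once the documents are re-indexed into `myindex_v2`, an alias can be switched over so clients keep using the old name, as the linked blog post describes.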

If you have such a big field and you do not want to search on it, why do you send it to Elasticsearch at all? Can't you prevent it from being sent to Elasticsearch in the first place?
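One way to do that, sketched below: filter out oversized string values on the client side before building the bulk request. The helper class is hypothetical (it is not part of the Elasticsearch Java API); 32766 is Lucene's per-term byte limit quoted in the exception, which a not_analyzed field hits when its whole value is indexed as a single term.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class FieldFilter {
    // Lucene rejects a single term whose UTF-8 encoding exceeds this many bytes.
    static final int MAX_TERM_BYTES = 32766;

    // Return a copy of the document without string fields that would produce
    // an "immense term" (e.g. msg_properties in the question above).
    static Map<String, Object> dropImmenseFields(Map<String, Object> doc) {
        Map<String, Object> filtered = new HashMap<>();
        for (Map.Entry<String, Object> e : doc.entrySet()) {
            Object v = e.getValue();
            if (v instanceof String
                    && ((String) v).getBytes(StandardCharsets.UTF_8).length > MAX_TERM_BYTES) {
                continue; // skip the oversized field instead of indexing it
            }
            filtered.put(e.getKey(), v);
        }
        return filtered;
    }
}
```

Each filtered map can then be passed as the source of an index request inside the bulk call, so the immense field never reaches the cluster.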

Hope that helps.

  • I am trying to replace an RDBMS with Elasticsearch so that I can get some performance improvement. This data needs to be stored in Elasticsearch for processing, but it should not be searchable. – Karthick T Aug 06 '14 at 05:39

It is not possible to change the mapping of an existing field.