I'm evaluating Elasticsearch and I generated a bunch of fake data. The amount field is defined as a double; the relevant piece of the mapping is "authamount": { "type": "double" }, etc.
In the Java code that generates the random amounts I round to 2 decimal places, and the data looks fine in Elasticsearch.
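Roughly speaking, the generator does something like this (not my exact code, just a sketch of the idea; the bounds are made up):

    import java.util.concurrent.ThreadLocalRandom;

    public class AmountGenerator {
        // Produce a random amount and round it to 2 decimal places,
        // which is how the test values are built before indexing.
        static double randomAmount() {
            double raw = ThreadLocalRandom.current().nextDouble(0.0, 1500.0);
            return Math.round(raw * 100.0) / 100.0;
        }

        public static void main(String[] args) {
            for (int i = 0; i < 5; i++) {
                System.out.println(randomAmount());
            }
        }
    }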
When I run a stats query as follows:
{
  "query": {
    "constant_score": {
      "filter": {
        "range": {
          "txndatestring": {
            "gte": "2017-01-01T15:44:04.068Z",
            "lte": "2017-01-31T15:44:04.068Z"
          }
        }
      }
    }
  },
  "aggs": {
    "auth_amount_stats": {
      "stats": { "field": "authamount" }
    }
  }
}
I see this result:
"aggregations": {
"auth_amount_stats": {
"count": 20810,
"min": 5.03,
"max": 1474.24,
"avg": 734.682198942815,
"sum": 15288736.559999982
}}
I don't understand how the sum can have so many decimal places when every individual amount has only two.
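I can reproduce something similar in plain Java, so I suspect this is about how doubles accumulate rather than anything Elasticsearch-specific, but I'd like to confirm. A minimal sketch (the values here are made up, the point is just that adding "2-decimal" doubles grows extra digits):

    public class DoubleSumDemo {
        public static void main(String[] args) {
            // Neither 0.1 nor 0.2 is exactly representable in binary
            // floating point, so even one addition shows extra digits.
            System.out.println(0.1 + 0.2);   // 0.30000000000000004

            // Summing many two-decimal doubles lets the tiny representation
            // and rounding errors accumulate, much like the stats "sum".
            double sum = 0.0;
            for (int i = 0; i < 1_000_000; i++) {
                sum += 0.01;
            }
            System.out.println(sum);         // not exactly 10000.0
        }
    }

Is this just normal floating-point behaviour of the double field type, or is there something in the stats aggregation I should be configuring differently?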