
I have a simple SQL query in Elasticsearch which I know returns fewer than 100 rows of results. How can I get all these results at once (i.e., without using scroll)? I tried the LIMIT n clause, but it works when n is less than or equal to 10 and doesn't work when n is greater than 10.

The Python code for calling the Elasticsearch SQL API is as follows.

import requests
import json

url = 'http://10.204.61.127:9200/_xpack/sql'
headers = {
    'Content-Type': 'application/json',
}
query = {
    'query': '''
        select
            date_start,
            sum(spend) as spend
        from
            some_index
        where
            campaign_id = 790
            or
            campaign_id = 490
        group by
            date_start
    '''
}
response = requests.post(url, headers=headers, data=json.dumps(query))

The above query returns a cursor ID. I tried feeding the cursor ID back into the same SQL API, but it didn't give me more results.

I also tried translating the above SQL query into a native Elasticsearch query using the SQL translate API and wrapping it in the following Python code, but that doesn't work either. I still got only 10 rows of results.

import requests
import json


url = 'http://10.204.61.127:9200/some_index/some_doc/_search'
headers = {
    'Content-Type': 'application/json',
}
query = {
    "size": 0,
    "query": {
        "bool": {
            "should": [
                {
                    "term": {
                        "campaign_id.keyword": {
                            "value": 790,
                            "boost": 1.0
                        }
                    }
                },
                {
                    "term": {
                        "campaign_id.keyword": {
                            "value": 490,
                            "boost": 1.0
                        }
                    }
                }
            ],
            "adjust_pure_negative": True,
            "boost": 1.0
        }
    },
    "_source": False,
    "stored_fields": "_none_",
    "aggregations": {
        "groupby": {
            "composite": {
                "size": 1000,
                "sources": [
                    {
                        "2735": {
                            "terms": {
                                "field": "date_start",
                                "missing_bucket": False,
                                "order": "asc"
                            }
                        }
                    }
                ]
            },
            "aggregations": {
                "2768": {
                    "sum": {
                        "field": "spend"
                    }
                }
            }
        }
    }
}
response = requests.post(url, headers=headers, data=json.dumps(query)).json() 
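For what it's worth, the composite aggregation in the translated query paginates via an after_key in each response rather than via size alone. A minimal sketch of that pagination, assuming the aggregation is named "groupby" as in the query above (the helper name next_composite_request is mine, not part of any API):

```python
import copy

def next_composite_request(query_body, response_json):
    """Given the previous request body and the parsed response, return
    the request body for the next page of composite-aggregation buckets,
    or None when there are no more pages.

    Assumes the composite aggregation is named "groupby", as in the
    translated query above.
    """
    agg = response_json["aggregations"]["groupby"]
    after_key = agg.get("after_key")
    if after_key is None or not agg["buckets"]:
        return None
    next_body = copy.deepcopy(query_body)
    # The "after" parameter tells the composite aggregation where to
    # resume on the next request.
    next_body["aggregations"]["groupby"]["composite"]["after"] = after_key
    return next_body
```

You would POST the original body, then keep POSTing whatever this returns until it returns None.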
Benjamin Du
3 Answers

POST _sql?format=json
{
  "query": "SELECT field1, field2 FROM indexTableName ORDER BY field1",
  "fetch_size": 10000
}

The above query will return a cursor in the response, which needs to be passed in the next call.

POST _sql?format=json
{
  "cursor": "g/W******lAAABBwA="
}

This resembles the normal scroll method in Elasticsearch.
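Put together in Python, the two calls above become a loop: send the query with a fetch_size on the first request, then keep resubmitting only the cursor until the response no longer contains one. A sketch under those assumptions (the URL and function name are placeholders; the post parameter exists only so the paging logic can be exercised without a cluster):

```python
import json

# Hypothetical endpoint; adjust host/port/path to your cluster.
SQL_URL = "http://localhost:9200/_xpack/sql?format=json"

def fetch_all_rows(query, fetch_size=10000, post=None):
    """Yield every row of an Elasticsearch SQL query by following cursors.

    The first request sends the query and fetch_size; each subsequent
    request sends only the cursor from the previous response, until no
    cursor is returned.
    """
    if post is None:
        import requests  # only needed when talking to a real cluster

        def post(body):
            return requests.post(
                SQL_URL,
                headers={"Content-Type": "application/json"},
                data=json.dumps(body),
            ).json()

    body = {"query": query, "fetch_size": fetch_size}
    while True:
        resp = post(body)
        for row in resp.get("rows", []):
            yield row
        cursor = resp.get("cursor")
        if not cursor:
            break  # no cursor means this was the last page
        body = {"cursor": cursor}
```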

Vignesh G

Elasticsearch limits the number of results it returns by default, but if you are using Python you can use elasticsearch-dsl, whose scan() helper pages through all matching documents for you.

from elasticsearch import Elasticsearch
from elasticsearch_dsl import Search, Q

client = Elasticsearch()

# self._frequency, self._start, and self._end come from the
# surrounding class this snippet was taken from.
q = Q('term', Frequency=self._frequency)
q = q & Q("range", **{'@timestamp': {"from": self._start, "to": self._end}})

# scan() iterates over every matching document, paging internally.
for hit in Search(using=client).query(q).scan():
    print(hit)
Mohammad reza Kashi
  • 321
  • 1
  • 5
  • 17

With elasticsearch-sql, LIMIT 100 should translate to "size": 100 in traditional query DSL. This will return up to 100 matching results.

Given this request:

POST _xpack/sql/translate
{
  "query":"SELECT FlightNum FROM flights LIMIT 100"
}

The translated query is:

{
  "size": 100,
  "_source": {
    "includes": [
      "FlightNum"
    ],
    "excludes": []
  },
  "sort": [
    {
      "_doc": {
        "order": "asc"
      }
    }
  ]
}

So syntax-wise, LIMIT N should do what you expect it to. As to why you're not seeing more results, this is likely something specific to your index, your query, or your data.

There is a setting, index.max_result_window, which can cap the size of a query, but it defaults to 10,000 and should also return an error rather than silently truncating the results.
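If you ever do hit that cap, it can be raised per index via the settings API (substitute your own index name for some_index; raising it increases memory pressure on deep pagination, so do so deliberately):

PUT some_index/_settings
{
  "index.max_result_window": 100000
}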

Eric Damtoft