
I am trying to download data from a BigQuery table which has 3 million records. I get the error:

 "Response too large to return. Consider setting allowLargeResults to true in your job configuration."

I tried the following command:

  df = bq.Query('SELECT * FROM [Test.results]', allow_large_results = True).to_dataframe()

Any help would be greatly appreciated.

user3447653
  • Although quite old, maybe [this](https://stackoverflow.com/questions/20287938/error-response-too-large-to-return-in-big-query) could be helpful. – Niklos Mar 18 '20 at 11:28

2 Answers


The way to retrieve the result of a query that is expected to be bigger than ~128 MB is to issue the query via the jobs.insert API with a destination table and the allowLargeResults flag set. After the result is stored in that table, you can retrieve it with tabledata.list. You can then, of course, delete that [intermediate] table. Hope you can identify the respective syntax in the client you are using.
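
In Python, with the google-cloud-bigquery client, that flow looks roughly like the sketch below. The destination table name my-project.Test.results_large is a placeholder, and use_legacy_sql is set because the [Test.results] bracket syntax is legacy SQL:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder destination table that will hold the large result set.
    destination = "my-project.Test.results_large"

    job_config = bigquery.QueryJobConfig(
        allow_large_results=True,  # permit results larger than ~128 MB
        use_legacy_sql=True,       # allowLargeResults applies to legacy SQL jobs
        destination=destination,   # required when allow_large_results is set
    )

    # Run the query; the result is written to the destination table.
    client.query("SELECT * FROM [Test.results]", job_config=job_config).result()

    # Page through the stored rows (tabledata.list under the hood).
    for row in client.list_rows(destination):
        print(row)

    # Drop the intermediate table once you are done with it.
    client.delete_table(destination)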

Mikhail Berlyant

This is quite old, but for those who land here, the way to do it is:

from google.cloud import bigquery

client = bigquery.Client()

# With the client's default standard SQL, [Test.results] is invalid syntax;
# use_legacy_sql=True keeps the original legacy-SQL query working.
job_config = bigquery.QueryJobConfig(
    allow_large_results=True,
    use_legacy_sql=True,
)

q = client.query("""SELECT * FROM [Test.results]""", job_config=job_config)
r = q.result()

df = r.to_dataframe()

From the google-cloud-bigquery docs.

  • I don't think that actually works for large results - I still see the: `Response too large to return. Consider setting allowLargeResults to true in your job configuration.` error message. – Mark Kelly Dec 31 '20 at 03:11
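
That matches the behavior the other answer describes: in legacy SQL, allowLargeResults is honored only when a destination table is also set, so the flag on its own still produces the error. A minimal adjustment to the job config (the destination table name here is a placeholder) would presumably be:

    job_config = bigquery.QueryJobConfig(
        allow_large_results=True,
        use_legacy_sql=True,                        # [Test.results] is legacy SQL
        destination="my-project.Test.results_big",  # required with allow_large_results
    )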