15

I am trying to run the query "select * from tablename", but it throws the error "Error: Response too large to return".

I was able to process other tables containing terabytes of data, but I am getting this error for a table that contains only 294 MB.

I was able to query the table by selecting columns by name, but only a subset of them, not all of the columns. My table has 26 columns in total, and I can select 16 of them without an error: "select column1,column2,column3,....column16 from tablename".

Is there any relation between the number of columns and the size of the table?

Please help me to fix this issue.

BigQuery table details:

Total Records: 683,038

Table Size: 294 MB

No. of Columns: 26

Viswanathan

4 Answers

18

Set allowLargeResults to true in your job configuration. You must also specify a destination table when you set the allowLargeResults flag.

If querying via the API:

"configuration": 
  {
    "query": 
    {
      "allowLargeResults": true,
      "query": "select uid from [project:dataset.table]"
      "destinationTable": [project:dataset.table]

    }
  }

If using the bq command-line tool:

$ bq query --allow_large_results --destination_table "dataset.table" "select uid from [project:dataset.table]"

If using the browser tool:

  • Click 'Enable Options'
  • Select 'Allow Large Results'
Shayan Masood
    Can you please tell me how to use these same options in the BigQuery browser tool? – Viswanathan Dec 02 '13 at 05:22
  • This is how you enable large results from the browser tool: click 'Enable Options', select the table in which you want to save the result, then check 'Allow Large Results'. – user1302884 May 07 '14 at 09:51
  • I have a doubt: if I run this query, it will create a new table every time, and when I query this table again it throws an error about allowLargeResults: true. – arjun kori Oct 17 '16 at 07:42
2
jobData = {'configuration': {'query': {'query': sql,
            'allowLargeResults': True,
            'destinationTable': {
                        'projectId': 'projectXYZ',
                        'tableId': 'tableXYZ',
                        'datasetId': 'datasetXYZ',
                                }
                        }}}

You can use 'writeDisposition' to specify whether to overwrite the destination table or not.

'writeDisposition':'WRITE_TRUNCATE' # If the table already exists, 
                                    # BigQuery overwrites the table data.
'writeDisposition':'WRITE_APPEND'   # If the table already exists, 
                                    # BigQuery appends the data to the table
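Putting the pieces above together, here is a minimal sketch of a complete job configuration combining allowLargeResults with a writeDisposition (the project, dataset, and table names are placeholders, not values from the original answer):

```python
# Placeholder query; replace with your own SQL.
sql = "select uid from [projectXYZ:datasetXYZ.tableXYZ]"

job_data = {
    'configuration': {
        'query': {
            'query': sql,
            'allowLargeResults': True,          # required for large result sets
            'destinationTable': {               # required when allowLargeResults is set
                'projectId': 'projectXYZ',
                'datasetId': 'datasetXYZ',
                'tableId': 'resultTableXYZ',
            },
            # Overwrite the destination table if it already exists;
            # use 'WRITE_APPEND' to append instead.
            'writeDisposition': 'WRITE_TRUNCATE',
        }
    }
}

# With google-api-python-client, the job would then be submitted roughly as:
#   bigquery.jobs().insert(projectId='projectXYZ', body=job_data).execute()
```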
DoOrDoNot
2

What if you try adding a LIMIT clause to your query?

Dewald Abrie
-1

Try setting allowLargeResults to true in your job configuration if you haven't already.

I am new to BigQuery. Can you please tell me where I can make these job configuration changes?

Viswanathan
    How are you running the query? Are you using the web interface? If so, you can click the 'enable options' button to show advanced query options, then select "allow large results". Note that you have to specify a destination table when you use "allowLargeResults" – Jordan Tigani Dec 01 '13 at 23:45
    @JordanTigani I am using BigQuery; it does not return the whole data. Can you help me solve this problem? http://stackoverflow.com/questions/39424391/how-to-fragment-bigquery-response-into-10000-in-every-request – arjun kori Sep 10 '16 at 13:10