I am using Google BigQuery's API with R, thanks to the bigrquery R package.

I am facing an issue that appears (randomly) when I use the insert_upload_job() function to store my dataset on BigQuery.

When I run this function, the red button appears in the console (= the job is running), but sometimes nothing happens, and after several long minutes I get an error message because of a timeout: Error 408 (Request Timeout).
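
For reference, here is roughly the kind of call that triggers the problem (a minimal sketch: the project, dataset, and table names are placeholders, and `df` stands for the data frame I am uploading):

```r
library(bigrquery)

# Placeholder identifiers, not my real project/dataset/table
project <- "my-gcp-project"
dataset <- "my_dataset"
table   <- "my_table"

# df is an ordinary data.frame already in memory.
# This call sometimes hangs and eventually fails with Error 408.
insert_upload_job(
  project = project,
  dataset = dataset,
  table   = table,
  values  = df,
  write_disposition = "WRITE_APPEND"
)
```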

It seems the error appears more often when my dataset size is around 500 000 bytes (= 0.5 MB).
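
(For what it's worth, by "size" I mean what base R's object.size() reports for the data frame, e.g.:)

```r
# How I measure the dataset size mentioned above
object.size(df)   # prints roughly "500000 bytes" for the uploads that fail
```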

Edit: When I restart R and retry, it sometimes works... That makes no sense...

Do you have any idea what could explain this? NB: the error is not explained in the Google BigQuery documentation: https://cloud.google.com/bigquery/docs/error-messages

Thanks in advance!

Remi
  • Please check the following link for help: https://cloud.google.com/blog/products/gcp/google-cloud-platform-for-data-scientists-using-r-with-google-bigquery-part-2-storing-and-retrieving-data-frames – Nareman Darwish Dec 31 '19 at 19:34
  • Thanks @NaremanDarwish! Actually, I already followed this tutorial. Everything works perfectly on my side when I try the example included in the tutorial, and everything works fine when I upload my own datasets of less than 300 000 bytes to BigQuery; it only fails when I try to upload bigger datasets (around 500 000 bytes). – Remi Jan 01 '20 at 14:57
  • Can you post a piece of your code for the insert_upload_job() function? – Nick_Kh Jan 02 '20 at 07:19
  • I'm experiencing a very similar thing. I get the 408 error with `bigrquery::bq_table_upload()`, however. Restarting the R session sometimes makes the job work, as you described. – Alfredo Hernández Jul 02 '20 at 12:21

0 Answers