We're using the BigQuery streaming API. All went well until recently (with no code changes); in the last few hours we've been getting many errors like:

"The API call urlfetch.Fetch() took too long to respond and was cancelled. Traceback (most recent call last): File "/base/data/home/runtimes/python27"

or

"Deadline exceeded while waiting for HTTP response from URL"

The insert call is made from a Python deferred task and is retried after a wait, roughly as sketched below.
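For reference, a rough sketch of that setup (insert_rows and stream_to_bigquery are hypothetical names standing in for our actual task function and insert wrapper; the retry delay is an assumed value):

from google.appengine.ext import deferred

RETRY_DELAY = 60  # seconds to wait before retrying (assumed)

def insert_rows(rows):
    # stream_to_bigquery is a hypothetical wrapper around
    # bigquery.tabledata().insertAll(...).execute()
    try:
        stream_to_bigquery(rows)
    except Exception:
        # Re-enqueue this task so the insert is retried after a wait.
        deferred.defer(insert_rows, rows, _countdown=RETRY_DELAY)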

Questions:

  • How can we check whether it's an issue on our side or a general problem with BigQuery?
  • Can we increase the 5000 ms timeout?
James

1 Answer


Are you running on App Engine? If so, you can do this:

from google.appengine.api import urlfetch
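# Raise the default urlfetch deadline from 5 seconds to 60 seconds.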
urlfetch.set_default_fetch_deadline(60)
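In context, here's a rough sketch of where that call could sit in an App Engine handler that streams rows into BigQuery, using the standard client setup of that era (the project, dataset, table, and row values below are assumed placeholders):

import httplib2
from apiclient.discovery import build
from google.appengine.api import urlfetch
from oauth2client.appengine import AppAssertionCredentials

# Raise the urlfetch deadline before any BigQuery calls are made.
urlfetch.set_default_fetch_deadline(60)

credentials = AppAssertionCredentials(
    scope='https://www.googleapis.com/auth/bigquery')
bigquery = build('bigquery', 'v2',
                 http=credentials.authorize(httplib2.Http()))

response = bigquery.tabledata().insertAll(
    projectId='my-project',   # assumed
    datasetId='my_dataset',   # assumed
    tableId='my_table',       # assumed
    body={'rows': [{'json': {'field': 'value'}}]}).execute()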

That said, streaming ingestion shouldn't come anywhere close to the default 5-second deadline. There was a networking configuration issue affecting streaming ingestion; it should be resolved now.

Are you still seeing the issues?

Jordan Tigani
  • Yes, we're on App Engine, but I don't directly import urlfetch. Where should I add the import? BTW, the frequency of errors has dropped significantly, but we still had 2 such errors today above 5000 ms. An average call takes 500-600 ms and may peak around 1000 ms. – James Apr 04 '14 at 20:02
  • We're using code similar to the Python example given in the accepted answer here: [link](http://stackoverflow.com/questions/22049248/how-to-use-bigquery-streaming-insertall-on-app-engine-python). That's why I don't know where to add the function you mentioned above to extend the default urlfetch deadline. – James Apr 04 '14 at 20:14
  • Does ~600 ms to stream-insert ~90 fields (one row) into BigQuery make sense? – James Apr 08 '14 at 17:00