It's been a while since I've had a chance to work on the pandas GBQ module, but I noticed that one of our regression tests is now failing.
The test in question is:
https://github.com/pydata/pandas/blob/master/pandas/io/tests/test_gbq.py#L254-L267
In short, the test attempts to create a table with 5 columns (types are Boolean, Float, String, Integer, Timestamp) and 1,000,001 rows. Inserting these rows in chunks of 10,000 fails with a "Request Too Large" response.
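For a rough sense of scale, here's a stdlib-only sketch that estimates the JSON payload size of a single 10,000-row `insertAll`-style request with the five column types the test uses. The field names and values are illustrative, not taken from the test itself:

```python
import json

ROWS_PER_CHUNK = 10_000  # chunk size used by the failing test

def make_row(i):
    # One hypothetical row with Boolean, Float, String, Integer,
    # and Timestamp-like fields, shaped like an insertAll row entry.
    return {
        "insertId": str(i),
        "json": {
            "bools": i % 2 == 0,
            "flts": i / 3.0,
            "strs": "row-%06d" % i,
            "ints": i,
            "times": "2014-01-01 00:00:00.000000",
        },
    }

# Serialize one full chunk as a request body and measure its size.
body = json.dumps({
    "kind": "bigquery#tableDataInsertAllRequest",
    "rows": [make_row(i) for i in range(ROWS_PER_CHUNK)],
})
payload_bytes = len(body.encode("utf-8"))
print(payload_bytes)
```

Even with small values this lands in the low megabytes per chunk, so it's plausible that a documented per-request size cap (rather than row count alone) is what we're hitting.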
I suspect this will have a similar answer to Getting "Query too large" in BigQuery, but since this test was passing previously, I'm wondering whether there's a backend problem that needs to be addressed. It's also possible the API changed when I wasn't looking!
TL;DR: What about our insertion is too large, and are there documented limits that we can reference?