
I have access to a project within BigQuery. I'm looking to create an ingestion-time partitioned table (partitioned by day), then set up a BigQuery Data Transfer Service process that brings Avro files in from multiple directories within a Google Cloud Storage bucket.

I'm able to create a non-partitioned table in BigQuery and set up the transfer correctly so that the table gets updated regularly. However, when I set up the table (via the console) to be partitioned by day on ingestion time, the same transfer process does not work.

I receive the error: "Failed to start job for table xxxxxxx with error INVALID_ARGUMENT: Incompatible table partitioning specification. Expects partitioning specification interval(type:day), but input partitioning specification is ; JobID: xxxxxxxxxxxx:bqts_xxxxxxxx-0000-xxxx-xxxx-xxxxxxxxxxxx"

I think using the command line is one possible solution, but are there any alternatives, e.g. using only the console?
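As a point of comparison with the console flow above, the `bq` CLI can create the ingestion-time, day-partitioned destination table explicitly before the transfer is configured. A minimal sketch, assuming placeholder names (`mydataset`, `mytable`, `schema.json` are not from the question):

```shell
# Create an ingestion-time partitioned table with daily partitions.
# mydataset.mytable and ./schema.json are hypothetical placeholders.
bq mk \
  --table \
  --time_partitioning_type=DAY \
  mydataset.mytable \
  ./schema.json
```

Afterwards, `bq show --format=prettyjson mydataset.mytable` should report a `timePartitioning` block with `"type": "DAY"`, which is the partitioning specification the transfer job's error message says it expects.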

MRHarv
  • Thank you for using our service. Sorry, this is a bug on our end and it's already fixed; we are rolling out the fix. It will be available to all in 2 weeks. We do support creating the destination table with partitioning and clustering info, but this feature is in Alpha now. Please IM me if you would like to try it out. – Deen酱 Nov 20 '20 at 23:49
  • I am using the BigQuery Data Transfer Service to transfer PARQUET files from Amazon S3. But because the BigQuery Amazon S3 data transfer uses write-append ONLY, while what I actually need is write-truncate for a daily transfer, I created a table partitioned by day on ingestion time as my destination table. The last partition would be the copy of the data I need. I got the same error "Failed to start job ... Expects partitioning specification interval(type:day), but input partitioning specification is ; JobID: xxxxxxx:bqts_xxxxxxxx" – searain Nov 27 '20 at 17:25
  • My questions: 1) "fix will be available to all in 2 weeks" — is it still on schedule? Are we only 1 week away from the fix being available? 2) The BigQuery Amazon S3 file transfer uses "write-append", but I need "write-truncate", so I use partitioning as a workaround. Are there any better solutions you can suggest for users who need "write-truncate"? 3) How do I IM you or contact the support team if we want to be on the try-out list too? – searain Nov 27 '20 at 17:27
  • 1
    I saw this question is closed at this time due to it is not reproducible. This error is reproducible and I run into the same error. – searain Dec 01 '20 at 21:22
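For the write-truncate use case raised in the comments, one option with an append-only transfer is to bypass the Data Transfer Service for the overwrite step and load into a single ingestion-time partition using a partition decorator, which replaces just that day's data. A sketch, assuming hypothetical table and bucket names:

```shell
# Overwrite one daily partition via the $YYYYMMDD partition decorator.
# mydataset.mytable and the gs:// path are hypothetical placeholders.
bq load \
  --replace \
  --source_format=PARQUET \
  'mydataset.mytable$20201127' \
  'gs://my-bucket/daily-export/*.parquet'
```

Note this is a manually run (or separately scheduled) `bq load`, not the Data Transfer Service itself, so it works around the transfer's append-only behavior rather than changing it.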

0 Answers