
I am getting the following error when scanning a large number of rows (~1 million) based on a prefix:

Status{code=DEADLINE_EXCEEDED, description=deadline exceeded: -646391190127 ns from now, cause=null}

Could someone suggest which BigtableOptions I should configure so that I don't see the DEADLINE_EXCEEDED error? I have tried setting BIGTABLE_RPC_TIMEOUT_MS_KEY and BIGTABLE_LONG_RPC_TIMEOUT_MS_KEY to large values, but I am still getting this error.
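For context, a minimal sketch of how these timeout keys can be set on the HBase-based Bigtable client via BigtableOptionsFactory before running the prefix scan. The project, instance, table, and prefix names below are placeholders, and the timeout values are only illustrative, not recommendations:

```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import com.google.cloud.bigtable.hbase.BigtableOptionsFactory;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class PrefixScanExample {
  public static void main(String[] args) throws IOException {
    // Placeholder project and instance IDs.
    Configuration config = BigtableConfiguration.configure("my-project", "my-instance");

    // Timeout-related keys mentioned in the question and comments (values in ms, illustrative).
    config.set(BigtableOptionsFactory.BIGTABLE_RPC_TIMEOUT_MS_KEY, "60000");
    config.set(BigtableOptionsFactory.BIGTABLE_LONG_RPC_TIMEOUT_MS_KEY, "600000");
    config.set(BigtableOptionsFactory.BIGTABLE_READ_RPC_TIMEOUT_MS_KEY, "600000");

    try (Connection connection = BigtableConfiguration.connect(config);
         Table table = connection.getTable(TableName.valueOf("my-table"))) {

      // Prefix scan over a large key range (~1 million rows).
      Scan scan = new Scan().setRowPrefixFilter(Bytes.toBytes("my-prefix#"));
      try (ResultScanner scanner = table.getScanner(scan)) {
        long count = 0;
        for (Result row : scanner) {
          count++; // process each row here
        }
        System.out.println("Rows scanned: " + count);
      }
    }
  }
}
```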

  • Have you tried changing the [BIGTABLE_READ_RPC_TIMEOUT_MS_KEY](https://cloud.google.com/bigtable/docs/hbase-client/javadoc/com/google/cloud/bigtable/hbase/BigtableOptionsFactory.html#BIGTABLE_READ_RPC_TIMEOUT_MS_KEY)? – pessolato Dec 18 '19 at 15:07
  • yup tried setting that as well – Nithin Dec 19 '19 at 00:52
  • I honestly think that the issue is more related to the way you designed your schema. Have you taken into consideration the best practices mentioned here: https://cloud.google.com/bigtable/docs/schema-design#row-key-prefixes? Are you using any of the key types mentioned in the "row keys to avoid" section? – Andrei Tigau Dec 30 '19 at 09:10
  • Did you ever end up finding a solution? – Zee Oct 04 '22 at 16:54

0 Answers