I have tried two approaches with the bigrquery package. The first:
library(bigrquery)
library(DBI)
con <- dbConnect(
  bigrquery::bigquery(),
  project = "YOUR PROJECT ID HERE",
  dataset = "YOUR DATASET"
)

# sql holds the query string (the same query shown in the second attempt below)
test <- dbGetQuery(con, sql, n = 10000, max_pages = Inf)
and the second:
project <- "YOUR PROJECT ID HERE"
sql <- "YOUR LARGE QUERY HERE"  # the long query is saved as a view; this selects from it
tb <- bigrquery::bq_project_query(project, sql)
bq_table_download(tb, max_results = 1000)
Both fail with the error "Error: Requested Resource Too Large to Return [responseTooLarge]". There is a potentially related issue here, but I am interested in any tool that gets the job done: I have already tried the solutions outlined here, but they failed.
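As far as I can tell, responseTooLarge means that a single API response is too big rather than that the result as a whole cannot be downloaded, so one thing I am considering is asking bq_table_download for smaller pages. A minimal sketch (the page_size value is just a guess):

# same download as above, but fetch rows in smaller pages so that each
# API response stays under BigQuery's per-response size limit
bq_table_download(tb, page_size = 1000)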
How can I load large datasets into R from BigQuery?
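For context, I would also be fine with a route through Google Cloud Storage: write the query result into a regular table (since, as far as I know, the temporary query-result table cannot be exported), export it to a bucket with bq_table_save(), and pull the files back down with googleCloudStorageR. A rough sketch of what I mean, where the bucket name, the extract/ prefix, and the query_result table name are placeholders and GCS authentication is assumed to be already configured:

library(bigrquery)
library(googleCloudStorageR)

# run the query into a real (non-temporary) table so it can be exported;
# sql is the same query string as above
dest <- bq_table("YOUR PROJECT ID HERE", "YOUR DATASET", "query_result")
tb2  <- bq_project_query("YOUR PROJECT ID HERE", sql, destination_table = dest)

# export that table to the bucket as sharded CSV files
bq_table_save(tb2, destination_uris = "gs://my-bucket/extract/result-*.csv")

# list the exported shards, download each one, and bind them together
objs   <- gcs_list_objects(bucket = "my-bucket", prefix = "extract/")
pieces <- lapply(objs$name, function(nm) {
  tmp <- tempfile(fileext = ".csv")
  gcs_get_object(nm, bucket = "my-bucket", saveToDisk = tmp)
  read.csv(tmp)
})
dat <- do.call(rbind, pieces)

Is something along these lines reasonable, or is there a simpler tool for this?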