
I want to connect to a BigQuery project from my AWS EC2 instance, running Ubuntu, using R, RStudio, and the tidyverse framework. I have no problem connecting from my local computer, which runs Windows 10:

library(bigrquery)
library(tidyverse)

proyect <- "api-project-456789012345"   # ID of the project
table   <- "`api-project-999999999.88888888.ga_sessions_*`"

con_bigquery <- DBI::dbConnect(bigquery(),
                               dataset = "some_project",
                               project = proyect,
                               billing = proyect)

# Query template; the placeholders are substituted below
query <- "select basic_cols
        from table"

basic_cols <- c(
  "date",
  "clientId",
  "visitNumber",
  "channelGrouping")

# Substitute the column list and the table name into the template
query <- query %>%
  str_replace("basic_cols", paste(basic_cols, collapse = ",")) %>%
  str_replace("table", table)

df_raw <- tbl(con_bigquery, sql(query))

When I run the code above locally, once it gets to the tbl() call, a pop-up window appears that guides me through an authentication process:

[Screenshots of the authentication flow: steps one through four (auth-part-1 to auth-part-4) and the final success message (auth-end-part).]

After that last message I'm able to work with the database without a problem. However, when I try to do the same on my EC2 instance running Ubuntu, I go through the same authentication process, and steps 1-4 work fine, but instead of the success message at the end I run into this:

This site can’t be reached. localhost refused to connect.
Search Google for localhost 1410
ERR_CONNECTION_REFUSED
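
The 1410 in that message seems to be the local port the browser-based OAuth flow tries to redirect back to, which a headless EC2 instance can't serve. For what it's worth, here is a sketch of the out-of-band ("copy and paste a code") alternative I've seen mentioned for bigrquery; the option and argument names come from the gargle/bigrquery documentation, not from anything I've verified on my instance:

library(bigrquery)

# Gargle-based bigrquery (>= 1.2.0): ask for an out-of-band flow, where the
# browser shows a code to copy into the R console instead of redirecting
# back to localhost:1410.
options(gargle_oob_default = TRUE)
bq_auth(use_oob = TRUE)

# Older httr-based releases have an analogous option:
# options(httr_oob_default = TRUE)

# Later DBI::dbConnect() / tbl() calls then reuse the cached token.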

Trying to find a solution, I came across this article: https://medium.com/@_orcaman/connecting-to-google-bigquery-from-aws-sagemaker-eb0e3c7556de, which suggests setting the environment variable GOOGLE_APPLICATION_CREDENTIALS on the instance like this:

export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/[FILE_NAME].json"

I did that, but it didn't work.
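
In case it clarifies what I was attempting: my understanding from the bigrquery/gargle docs is that the downloaded service-account key can also be passed to R directly, roughly like the sketch below (the key path and file name are placeholders, and I haven't gotten this working on the instance):

library(bigrquery)
library(DBI)

# Authenticate with the downloaded service-account key instead of the
# interactive OAuth pop-up; the path below is a placeholder.
bq_auth(path = "/home/user/Downloads/my-service-account-key.json")
# (older bigrquery releases expose set_service_token() for the same purpose)

proyect <- "api-project-456789012345"

con_bigquery <- dbConnect(bigquery(),
                          dataset = "some_project",
                          project = proyect,
                          billing = proyect)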

I would appreciate any kind of help.

Milenko
  • I'll mark it as duplicate, unless/until Milenko comes back and tells us more about why it's not. Happy to keep digging if that's the case – Felipe Hoffa Sep 09 '19 at 22:09
  • I tried to do what was suggested in the link and it didn't work. – Milenko Sep 16 '19 at 01:07
  • Please help us reproduce the behavior. – Felipe Hoffa Sep 16 '19 at 05:20
  • Felipe, I'm trying to access a BigQuery project from RStudio on an AWS instance running Ubuntu. I don't know what else to provide to help reproduce the problem. Also, I don't understand why the question was marked as a duplicate. The links that were provided are about an issue that, although related to my question, is not the same. – Milenko Sep 23 '19 at 20:37
  • Hi Milenko! This question was marked as a duplicate because you posted it on September 6th (I think Tim Swast left a comment, but the comment is not there anymore?) and then only came back 10 days later! Please give us more context. Or maybe it's a bug? Maybe posting it as an issue on https://github.com/r-dbi/bigrquery/issues/22 would bring in the appropriate developers? I'd love to find an answer to this problem too! – Felipe Hoffa Sep 24 '19 at 05:47
  • Felipe, can you connect to a BigQuery project from an EC2 instance using dbConnect()? – Milenko Sep 29 '19 at 20:12
  • I don't have access to an EC2 instance... can you open an issue here? https://github.com/r-dbi/bigrquery/issues/161 – Felipe Hoffa Sep 29 '19 at 22:21
  • In the end I found the solution here: https://github.com/tidyverse/googledrive/issues/79 – Milenko Oct 02 '19 at 19:21

0 Answers