I want to connect to a BigQuery project from my AWS EC2 instance (running Ubuntu) using R, RStudio, and the tidyverse framework. I have no problem connecting from my local computer, which runs Windows 10:
library(bigrquery)
library(tidyverse)

project <- "api-project-456789012345"  # ID of the project
table <- "`api-project-999999999.88888888.ga_sessions_*`"

con_bigquery <- DBI::dbConnect(
  bigquery(),
  dataset = "some_project",
  project = project,
  billing = project
)

# Template query; the placeholders are filled in below
query <- "select basic_cols
from table"

basic_cols <- c(
  "date",
  "clientId",
  "visitNumber",
  "channelGrouping"
)

# Substitute the column list and the table name into the template
query <- query %>%
  str_replace("basic_cols", paste(basic_cols, collapse = ",")) %>%
  str_replace("table", table)

df_raw <- tbl(con_bigquery, sql(query))
When I run the code above locally, once it reaches the tbl() call, a browser window pops up and guides me through an authentication process:
[Screenshots of the authentication flow: steps 1 through 4, followed by a final success message]
After the last message, I'm able to work with the database without a problem. However, when I try to do the same on my EC2 instance running Ubuntu, I go through the same authentication process: steps 1-4 work fine, but instead of the success message at the end, I run into this:
This site can’t be reached
localhost refused to connect.
Search Google for localhost 1410
ERR_CONNECTION_REFUSED
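From what I can tell, the OAuth flow that bigrquery launches (via gargle) redirects the browser to a temporary server on localhost:1410 of the machine running R, and on a headless EC2 instance there is nothing my local browser can reach at that address. One workaround I'm considering is gargle's out-of-band flow, where you open the URL in any browser and paste the returned code into the R console instead of relying on the redirect. A minimal sketch, assuming bq_auth()'s use_oob argument works the way I think it does:

library(bigrquery)

# Out-of-band OAuth: gargle prints a URL to open in any browser,
# then the returned code is pasted back into the R console,
# so no localhost redirect on the EC2 box is needed
options(gargle_oob_default = TRUE)
bq_auth(use_oob = TRUE)

But ideally I'd like a fully non-interactive setup on the server.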
Trying to find a solution, I came across this article: https://medium.com/@_orcaman/connecting-to-google-bigquery-from-aws-sagemaker-eb0e3c7556de, where it is suggested to set the environment variable GOOGLE_APPLICATION_CREDENTIALS on the instance like this:
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/[FILE_NAME].json"
I did this, but it didn't work.
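For completeness, my understanding from the bigrquery docs is that you can also point it at the service-account key explicitly from R, bypassing the browser flow entirely; something like this (the JSON path is the same placeholder as above):

library(bigrquery)

# Authenticate with the service-account key file directly,
# instead of the interactive browser flow
bq_auth(path = "/home/user/Downloads/[FILE_NAME].json")

But I'm not sure whether this is the right approach, or how it interacts with the GOOGLE_APPLICATION_CREDENTIALS variable I already set.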
I would appreciate any kind of help.