
I am trying to use Cloud Functions for Firebase to build an API that talks with a Google Cloud SQL (PostgreSQL) instance.

I am using an HTTP(S) trigger.

When I whitelist my desktop's IP address, I can connect to Cloud SQL from my local machine using the function's Node.js code. But once deployed, the function can't connect, and I can't figure out the host IP address of the Cloud Functions server to whitelist.

How do you talk to Google Cloud SQL from Cloud Functions for Firebase?

Thanks!

// Code sample of what's working on localhost.
var functions = require('firebase-functions');

var pg = require('pg');
var pgConfig = {
  user: functions.config().pg.user,
  database: functions.config().pg.database,
  password: functions.config().pg.password,
  host: functions.config().pg.host
};

var client = new pg.Client(pgConfig);

exports.helloSql = functions.https.onRequest((request, response) => {
  console.log('connecting...');
  try {
    client.connect(function(err) {
      if (err) throw err;

      console.log('connection success');
      console.log('querying...');

      client.query('SELECT * FROM guestbook;', function(err, result){
        if (err) throw err;

        console.log('querying success.');
        console.log('Results: ', result);
        console.log('Ending...');

        client.end(function(err){
          if (err) throw err;
          console.log('End success.');
          response.send(result);
        });
      });

    });
  } catch(er) {
    console.error(er.stack)
    response.status(500).send(er);
  }
});
Doug Stevenson
Quang Van

7 Answers


I found the answer in further discussion of issue #36388165.

Disclaimer: this does not seem to be announced officially, so it may change later. Also, I have only tested it with MySQL, but given the nature of this solution, the same approach should work with the pg module (it appears to accept a Unix domain socket path as the host parameter).

EDIT (2017/12/07): Google now seems to provide official early access, and the same method still works.
EDIT (2018/07/04): it seems that someone copy-and-pasted my example code and got into trouble. As Google says, you should use a connection pool to avoid leaking SQL connections (leaks cause ECONNREFUSED), so I have changed the example code a bit.
EDIT (2019/04/04): in the example below, using $DBNAME as the instance name was confusing, so I have modified the example.

In https://issuetracker.google.com/issues/36388165#comment44, a Google engineer says a Cloud Functions instance can talk to Cloud SQL through a Unix domain socket at the special path '/cloudsql/$PROJECT_ID:$REGION:$DBNAME'.

I can actually connect to and operate Cloud SQL from the Cloud Function code below.

const mysql = require('mysql');

// Use a connection pool instead of opening a connection per request;
// leaked connections eventually cause ECONNREFUSED.
const pool = mysql.createPool({
    connectionLimit: 1,
    // $PROJECT_ID:$REGION:$SPANNER_INSTANCE_NAME is the "instance
    // connection name" shown on your Cloud SQL instance's page.
    socketPath: '/cloudsql/' + '$PROJECT_ID:$REGION:$SPANNER_INSTANCE_NAME',
    user: '$USER',
    password: '$PASS',
    database: '$DATABASE'
});

exports.handler = function handler(req, res) {
    // Query through the pool instead of creating a connection per call.
    pool.query('SELECT * FROM table WHERE id = ?',
               [req.body.id], function (e, results) {
        // build the reply here
    });
};
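Since the question is about PostgreSQL: the pg module should accept the same Unix socket path as its `host` option (it treats a host beginning with `/` as a socket directory). A minimal sketch of the equivalent configuration — every name below is a placeholder for illustration, not a value from this answer:

```javascript
// Build the Cloud SQL socket path from its parts; the arguments are
// hypothetical placeholders for your own project, region, and instance.
function cloudSqlSocketPath(project, region, instance) {
  // Cloud Functions exposes the instance at /cloudsql/<instance connection name>
  return '/cloudsql/' + project + ':' + region + ':' + instance;
}

// This object would be passed to `new pg.Pool(...)`; a host pointing at
// the socket directory replaces the whitelisted IP from the question.
const pgConfig = {
  max: 1, // keep the pool small inside a single function instance
  host: cloudSqlSocketPath('my-project', 'us-central1', 'my-instance'),
  user: 'dbuser',
  password: 'secret',
  database: 'guestbook'
};

console.log(pgConfig.host);
// → /cloudsql/my-project:us-central1:my-instance
```

With a config like that, the question's `SELECT * FROM guestbook;` query should work unchanged, since only the transport differs.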

I hope this helps those who cannot wait for the official announcement from Google.

takehiro iyatomi
  • this did work. the socket path is the same as the "instance connection name" in the properties of your instance. – kospol Aug 01 '17 at 04:59
  • Does this work with a First Generation Cloud SQL server? I noticed that the instance connection name does not have a region for my first generation server. – rudolph1024 Aug 09 '17 at 19:15
  • @rudolph1024 have you already tried it? I don't have a 1st gen Cloud SQL server, so I cannot try. But if '$PROJECT_ID:$REGION:$DBNAME' means "instance connection name", as kospol says, it may work. I would appreciate it if you try and post the result. – takehiro iyatomi Aug 10 '17 at 03:58
  • @takehiroiyatomi I tried it with $PROJECT_ID:$REGION:$DBNAME and with $PROJECT_ID:$DBNAME, but neither of these worked. – rudolph1024 Aug 10 '17 at 23:58
  • @rudolph1024 thank you for reporting, but I'm sorry to hear that. Maybe this is the reason Google does not seem to have announced it yet. – takehiro iyatomi Aug 11 '17 at 07:18
  • Works Perfectly – Yanai Aug 31 '17 at 09:17
  • @rudolph1024 FYI, if you are still interested in connecting a 1st gen Cloud SQL instance from Cloud Functions, the official doc (https://docs.google.com/document/d/1XZAohlR2Ew_ShZ6wJE7LeMMk6YhhKTEUoEHz_NKt6Mw) says it is possible with an instance connection name like ":" – takehiro iyatomi Dec 06 '17 at 18:29
  • don't you need to whitelist ipaddresses to access the cloud sql database? – Cris Dec 08 '17 at 20:33
  • @Cris I don't need to do it. do you? – takehiro iyatomi Dec 11 '17 at 01:00
  • If you need to connect from another Google Cloud Platform project, add `@appspot.gserviceaccount.com` to your IAM and provide the Cloud SQL Client role. – Wes Cossick Dec 29 '17 at 15:44
  • Wondering it works for connection through firebase free account? – user4092086 Jan 24 '18 at 17:32
  • @user4092086 I don't have firebase project but according to official user guide and https://stackoverflow.com/a/42859932/1982282, Cloud Function for Firebase seems to be thin wrapper of Google Cloud Functions, and there is no caveats about it in official guide. so its likely to work. same as previous case, I would appreciate if you try and post the result. – takehiro iyatomi Jan 29 '18 at 01:06
  • looking at your example, that is the difference between $DBNAME and $DATABASE? – Golden mole Apr 02 '19 at 06:00
  • @Goldenmole certainly this was confusing. $DBNAME is a spanner instance name and $DATABASE is what we called database in other database server product. (eg. mysql). I modify example to clarify these difference. thanks! – takehiro iyatomi Apr 04 '19 at 04:08
  • @takehiroiyatomi, thank you for your response. Since I am still new to GCP, could you explain where I can find value for $SPANNER_INSTANCE_NAME and what it is? I googled spanner instance but I am still not clear about what it is. – Golden mole Apr 04 '19 at 20:08

New Answer:

See other answers, it's now officially supported. https://cloud.google.com/functions/docs/sql

Old Answer:

It's not currently possible. It is however a feature request on the issue tracker #36388165:

Connecting to Cloud SQL from Cloud Functions is currently not supported, as the UNIX socket does not exist (causing ENOENT) and there is no defined IP range to whitelist (causing ETIMEDOUT). One possibility is to whitelist 0.0.0.0/0 from the Cloud SQL instance but this is not recommended for security reasons.

If this is an important feature for you, I would suggest you visit the issue tracker and star the feature request to help it gain popularity.

Niklas B
  • It is currently possible to connect to Cloud SQL from Cloud Functions easily. There is an official guide as well. Check the other answers. – vovahost May 21 '19 at 03:21
  • Do note that my answer is from 2017, so I don't see the need for downvoting it. I will update it to reflect that it's no longer relevant. – Niklas B Dec 06 '19 at 09:14
  • If your cloud function is in Java and you are following the GCP Doc linked in the answer the following repo in the GCP github also can be useful: https://github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory – Shabirmean Jan 19 '21 at 16:15

Find your database region and instance name on the GCP > SQL > Instances page.

Save your database password into Firebase environment by running:

$ firebase functions:config:set \
    db.user="<username>" \
    db.password="<password>" \
    db.database="<database>"

Then...

db.js

const { Pool } = require('pg');
const { config } = require('firebase-functions');

const project = process.env.GCP_PROJECT;
const region = 'europe-west1';
const instance = 'db';

module.exports = new Pool({
  max: 1,
  host: `/cloudsql/${project}:${region}:${instance}`,
  ...config().db
});

someFunction.js

const { https } = require('firebase-functions');
const db = require('./db');

module.exports = https.onRequest((req, res) =>
  db
    .query('SELECT version()')
    .then(({ rows: [{ version }] }) => {
      res.send(version);
    }));

See also https://stackoverflow.com/a/48825037/82686 (using modern JavaScript syntax via Babel)
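On a Node 8+ runtime the same handler can also be written with async/await instead of a promise chain. A sketch, with a stubbed `db.query` standing in for the Pool module above so it runs self-contained (the stub and its version string are invented for illustration):

```javascript
// Stub standing in for the ./db Pool: query() resolves with a pg-like
// result shape ({ rows: [...] }).
const db = {
  query: async () => ({ rows: [{ version: 'PostgreSQL 9.6 (stub)' }] })
};

// The handler body you would pass to functions.https.onRequest(...):
const handler = async (req, res) => {
  const { rows: [{ version }] } = await db.query('SELECT version()');
  res.send(version);
};

// Exercise it with a minimal fake response object:
handler({}, { send: (v) => console.log(v) });
// → PostgreSQL 9.6 (stub)
```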

Konstantin Tarkus

There is now official documentation for this, though it is still in beta as of July 2018:

https://cloud.google.com/functions/docs/sql

Ron Chan

Connecting from Google Cloud Functions to Cloud SQL using TCP and Unix domain sockets (2020)

1. Create a new project:

gcloud projects create gcf-to-sql
gcloud config set project gcf-to-sql
gcloud projects describe gcf-to-sql

2. Enable billing on your project: https://cloud.google.com/billing/docs/how-to/modify-project

3. Set the compute project-info metadata:

gcloud compute project-info describe --project gcf-to-sql
#Enable the API, and check that google-compute-default-region and google-compute-default-zone are not yet set. Then set the metadata:
gcloud compute project-info add-metadata --metadata google-compute-default-region=europe-west2,google-compute-default-zone=europe-west2-b

4. Enable the Service Networking API:

gcloud services list --available
gcloud services enable servicenetworking.googleapis.com

5. Create two Cloud SQL instances, one with an internal IP and one with a public IP (https://cloud.google.com/sql/docs/mysql/create-instance):

6.a Cloud SQL instance with external IP:

#Create the sql instance
gcloud --project=gcf-to-sql beta sql instances create database-external --region=europe-west2
#Set the password for the "root@%" MySQL user:
gcloud sql users set-password root --host=% --instance database-external --password root 
#Create a user
gcloud sql users create user_name --host=% --instance=database-external  --password=user_password
#Create a database
gcloud sql databases create user_database --instance=database-external
gcloud sql databases list --instance=database-external

6.b Cloud SQL instance with internal IP:

i. Create a private connection to Google, so that VM instances in the default VPC network can use private services access to reach Google services that support it:

gcloud compute addresses create google-managed-services-my-network --global --purpose=VPC_PEERING --prefix-length=16 --description="peering range for Google" --network=default --project=gcf-to-sql
gcloud services vpc-peerings connect --service=servicenetworking.googleapis.com --ranges=google-managed-services-my-network --network=default --project=gcf-to-sql
#Check whether the operation was successful.
gcloud services vpc-peerings operations describe --name=operations/pssn.dacc3510-ebc6-40bd-a07b-8c79c1f4fa9a
#List private connections
gcloud services vpc-peerings list --network=default --project=gcf-to-sql
 
ii.Create the instance:

gcloud --project=gcf-to-sql beta sql instances create database-ipinternal --network=default --no-assign-ip --region=europe-west2
#Set the password for the "root@%" MySQL user:
gcloud sql users set-password root --host=% --instance database-ipinternal --password root
#Create a user
gcloud sql users create user_name --host=% --instance=database-ipinternal  --password=user_password
#Create a database
gcloud sql databases create user_database --instance=database-ipinternal
gcloud sql databases list --instance=database-ipinternal 


gcloud sql instances list
gcloud sql instances describe database-external
gcloud sql instances describe database-ipinternal
#Remember the instances connectionName

OK, so we have two MySQL instances. We will connect from Google Cloud Functions to database-ipinternal using Serverless VPC Access and TCP, and to database-external using a Unix domain socket.

7. Enable the Cloud SQL Admin API:

gcloud services list --available
gcloud services enable sqladmin.googleapis.com

Note: By default, Cloud Functions does not support connecting to the Cloud SQL instance using TCP. Your code should not try to access the instance using an IP address (such as 127.0.0.1 or 172.17.0.1) unless you have configured Serverless VPC Access.

8.a Ensure the Serverless VPC Access API is enabled for your project:

gcloud services enable vpcaccess.googleapis.com

8.b Create a connector:

gcloud compute networks vpc-access connectors create serverless-connector --network default --region europe-west2 --range 10.10.0.0/28
#Verify that your connector is in the READY state before using it
gcloud compute networks vpc-access connectors describe serverless-connector --region europe-west2

9. Create a service account for your Cloud Function. Ensure that this service account has the following IAM roles: Cloud SQL Client and, for connecting to Cloud SQL on an internal IP, also Compute Network User.

gcloud iam service-accounts create cloud-function-to-sql
gcloud projects add-iam-policy-binding gcf-to-sql --member serviceAccount:cloud-function-to-sql@gcf-to-sql.iam.gserviceaccount.com   --role roles/cloudsql.client
gcloud projects add-iam-policy-binding gcf-to-sql --member serviceAccount:cloud-function-to-sql@gcf-to-sql.iam.gserviceaccount.com  --role roles/compute.networkUser

Now that the setup is configured:

1. Connect from Google Cloud Functions to Cloud SQL using TCP and a Unix domain socket:

cd app-engine-standard/
ls
#main.py requirements.txt

cat requirements.txt
sqlalchemy
pymysql
      
cat main.py
import pymysql
from sqlalchemy import create_engine


def gcf_to_sql(request):
    # TCP, through the Serverless VPC Access connector, to the
    # instance's internal IP:
    engine_tcp = create_engine('mysql+pymysql://user_name:user_password@10.36.0.3:3306')
    existing_databases_tcp = engine_tcp.execute("SHOW DATABASES;")
    con_tcp = "Connecting from Google Cloud Functions to Cloud SQL using TCP: databases => " + str([d[0] for d in existing_databases_tcp]).strip('[]') + "\n"
    # Unix domain socket, via the instance connection name:
    engine_unix_socket = create_engine('mysql+pymysql://user_name:user_password@/user_database?unix_socket=/cloudsql/gcf-to-sql:europe-west2:database-external')
    existing_databases_unix_socket = engine_unix_socket.execute("SHOW DATABASES;")
    con_unix_socket = "Connecting from Google Cloud Functions to Cloud SQL using Unix sockets: databases => " + str([d[0] for d in existing_databases_unix_socket]).strip('[]') + "\n"
    return con_tcp + con_unix_socket
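The same two connection modes can be expressed as configuration for the Node.js mysql module. A sketch only — the internal IP, instance connection name, and credentials are the placeholder values from the steps above:

```javascript
// a) TCP through the Serverless VPC Access connector, addressing the
//    instance's internal IP (10.36.0.3 in this walkthrough):
const tcpConfig = {
  host: '10.36.0.3',
  port: 3306,
  user: 'user_name',
  password: 'user_password',
  database: 'user_database'
};

// b) Unix domain socket, using the instance connection name:
const socketConfig = {
  socketPath: '/cloudsql/gcf-to-sql:europe-west2:database-external',
  user: 'user_name',
  password: 'user_password',
  database: 'user_database'
};

// Either object would be passed to mysql.createPool(...), as in the
// earlier answers.
console.log(tcpConfig.host, socketConfig.socketPath);
```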
     

2. Deploy the Cloud Function:

gcloud beta functions deploy gcf_to_sql --runtime python37 --region europe-west2 --vpc-connector projects/gcf-to-sql/locations/europe-west2/connectors/serverless-connector  --trigger-http
 

3. Go to Cloud Functions, choose gcf_to_sql, then Testing > TEST THE FUNCTION:

#Connecting from Google Cloud Functions to Cloud SQL using TCP: databases => 'information_schema', 'mysql', 'performance_schema', 'sys', 'user_database'
#Connecting from Google Cloud Functions to Cloud SQL using Unix sockets: databases => 'information_schema', 'mysql', 'performance_schema', 'sys', 'user_database'

SUCCESS!

d-_-b
marian.vladoi

Cloud Functions - Supported Services: I don't see Cloud SQL on this list, so perhaps it's not supported yet.

Quang Van

You can also authorize the Firebase IP address range, since we don't really know which IP address Firebase uses externally.

I've experimented with this. Google Cloud SQL DOES NOT USE internal IP addresses, so you CANNOT use 10.128.0.0/20 to allow internal IP addresses for your Google Cloud SQL.

Answer

So, from the console, go to Google Cloud SQL > Instance > Authorization, where you can add:

151.101.0.0/17

which allows the 151.101.0.0 to 151.101.127.255 IP address range; the Firebase server domain currently points at 151.101.1.195 and 151.101.65.195.
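As a quick sanity check of that claim: a /17 mask keeps the top 17 bits of the address, so 151.101.0.0/17 spans 151.101.0.0 through 151.101.127.255, which contains both addresses above. The arithmetic, sketched:

```javascript
// Convert a dotted-quad IPv4 address to a 32-bit integer.
function ipToInt(ip) {
  return ip.split('.').reduce((acc, octet) => acc * 256 + Number(octet), 0);
}

// True if ip falls inside the CIDR block base/prefixLen.
function inCidr(ip, base, prefixLen) {
  const mask = prefixLen === 0 ? 0 : (-1 << (32 - prefixLen)) >>> 0;
  return ((ipToInt(ip) & mask) >>> 0) === ((ipToInt(base) & mask) >>> 0);
}

console.log(inCidr('151.101.1.195', '151.101.0.0', 17));   // → true
console.log(inCidr('151.101.65.195', '151.101.0.0', 17));  // → true
console.log(inCidr('151.101.128.1', '151.101.0.0', 17));   // → false
```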

I'm not sure if these IP addresses will ever change.

Also, make sure that your Cloud SQL database is in the us-central zone; Firebase seems to be available in us-central.

Franz Noel