
I can't figure out the syntax to connect my App Engine Flexible (AEF) environment to multiple Cloud SQL instances.

These are the two configs that I have tried:

beta_settings:
    cloud_sql_instances: 
         - pore-gdic:europe-west1:a-django
         - pore-gdic:europe-west1:a-airflow-5

This failed with:

ERROR: (gcloud.app.deploy) Error Response: [13] Invalid Cloud SQL name: []

Based on the response from this question: Connecting to multiple CloudSQL instances using Cloud sql proxy?

beta_settings:
     cloud_sql_instances: 
       pore-gdic:europe-west1:a-django,pore-gdic:europe-west1:a-airflow-5

This doesn't fail on deployment, but the webpage doesn't work at all.

Does anyone have a working solution to this problem?

For completeness, one database runs MySQL and the other PostgreSQL, so ports aren't an issue.

Daniel Lee

2 Answers


In the GAE Flex environment, you are using the correct syntax in your app.yaml:

beta_settings:
    cloud_sql_instances: 
        pore-gdic:europe-west1:a-django,pore-gdic:europe-west1:a-airflow-5

Your problem is most likely that you are using the wrong connection URL in your app. With the above in your app.yaml, you are instructing GAE to use Unix sockets to connect to your databases. Your connection strings should look like the following:

mysql://USERNAME:PASSWORD@/DATABASE?unix_socket=/cloudsql/pore-gdic:europe-west1:a-django

postgresql://USERNAME:PASSWORD@/DATABASE?host=/cloudsql/pore-gdic:europe-west1:a-airflow-5
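A minimal Python sketch of how those two Unix-socket URLs might be assembled; the project and instance names are the asker's, while the usernames, passwords, and database names are placeholders I've made up:

```python
# Cloud SQL's Unix sockets are created under /cloudsql on GAE Flex.
SOCKET_DIR = "/cloudsql"

def mysql_url(user: str, password: str, db: str, instance: str) -> str:
    """MySQL drivers take the socket path via a ?unix_socket= query parameter."""
    return f"mysql://{user}:{password}@/{db}?unix_socket={SOCKET_DIR}/{instance}"

def postgres_url(user: str, password: str, db: str, instance: str) -> str:
    """Postgres drivers take the socket directory via ?host= instead."""
    return f"postgresql://{user}:{password}@/{db}?host={SOCKET_DIR}/{instance}"

# Hypothetical credentials; the instance connection names are from the question.
django_url = mysql_url("django", "secret", "django_db",
                       "pore-gdic:europe-west1:a-django")
airflow_url = postgres_url("airflow", "secret", "airflow_db",
                           "pore-gdic:europe-west1:a-airflow-5")
```

Note the asymmetry: the MySQL URL passes the full socket path in `unix_socket=`, while the Postgres URL passes the socket directory in `host=`.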

If you want to test on a local machine without changing anything, you can install the Cloud SQL proxy. This way, running your app locally connects the same way as when deployed. Use the following to start the proxy before your app:

./cloud_sql_proxy -dir=/cloudsql -instances=pore-gdic:europe-west1:a-django,pore-gdic:europe-west1:a-airflow-5

This will cause it to create the sockets in /cloudsql, which is what the above connection URLs specify (e.g. /cloudsql/pore-gdic:europe-west1:a-django).

This section on the CloudSQL proxy page has more information on the difference between using Unix sockets and TCP to connect when using multiple instances.
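For comparison, if the proxy were started in TCP mode instead (an alternative the linked page describes, not what this answer recommends), each instance must be mapped to its own local port, and the connection URLs point at 127.0.0.1 rather than a socket path. A hypothetical sketch, with ports chosen arbitrarily:

```python
# Assuming the proxy were started in TCP mode, e.g.:
#   ./cloud_sql_proxy -instances=pore-gdic:europe-west1:a-django=tcp:3306,pore-gdic:europe-west1:a-airflow-5=tcp:5432
# the Unix-socket URLs above would become ordinary host:port URLs:
mysql_tcp = "mysql://USERNAME:PASSWORD@127.0.0.1:3306/DATABASE"
postgres_tcp = "postgresql://USERNAME:PASSWORD@127.0.0.1:5432/DATABASE"
```

This is also why sockets are convenient for multiple instances: there are no port assignments to keep track of.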

kurtisvg

Since I couldn't figure this out, I exposed the airflow database to 0.0.0.0/0. The information in this database is not sensitive, but to thwart attackers, I updated the root password to a 64-character password generated from 1pass:

Something akin to: XC$YvCBJ{avigR^LibD#Nn7i9MrU3qpH}GVcD(i4]9)Lg7)KZwT3xfQ)GW2z3rt4.

This should take until the sun burns out to crack.

Then I connect to the MySQL instance using this password, the host IP, and the database name, all read from settings.py.

Something like:

import logging

import MySQLdb

import settings  # settings.py defines the AIRFLOW_* values below

logger = logging.getLogger(__name__)


def trigger_dag_master():
    """
    Trigger the dag_master dag running in airflow.
        Connect to the cloudsql instance running airflow using the cloudsql proxy
        and insert a row that triggers dag_master.
        Based on an insert pattern found in:
        https://github.com/apache/incubator-airflow/blob/master/airflow/api/common/experimental/trigger_dag.py
    :return: True if successful
    """

    def insert(connection: MySQLdb.connections.Connection, query: str) -> bool:
        """
        Insert query into the MySQL database.
            Handles rollback on failure.
        :param connection: mysql connection
        :param query: query to execute
        :return: True if successful
        """
        try:
            cursor = connection.cursor()
            cursor.execute(query)
            connection.commit()
            return True
        except Exception as e:
            logger.error(e.with_traceback(e.__traceback__))
            connection.rollback()
            return False

    assert all([settings.AIRFLOW_PASSWORD, settings.AIRFLOW_USER,
                settings.AIRFLOW_HOST, settings.AIRFLOW_DB])

    mysql_connection = MySQLdb.connect(settings.AIRFLOW_HOST,
                                       settings.AIRFLOW_USER,
                                       settings.AIRFLOW_PASSWORD,
                                       settings.AIRFLOW_DB)
    ....
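The elided part presumably builds the INSERT that creates the triggering row. A hypothetical sketch of what that query might look like, based on the dag_run table layout used by the linked trigger_dag.py at the time (the column names and values are my assumptions, not the asker's actual code):

```python
from datetime import datetime, timezone

def trigger_query(dag_id: str) -> str:
    """Build a hypothetical INSERT creating an externally-triggered dag_run
    row; dag_run's columns here are assumed from incubator-airflow sources."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
    run_id = f"manual__{now}"
    return (
        "INSERT INTO dag_run (dag_id, execution_date, run_id, state, external_trigger) "
        f"VALUES ('{dag_id}', '{now}', '{run_id}', 'running', 1)"
    )

# Usage with the insert() helper above:
# insert(mysql_connection, trigger_query('dag_master'))
```

In real code the values should go through parameterized queries (`cursor.execute(sql, params)`) rather than string formatting.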

Hope this helps someone.

Daniel Lee