I am trying to deploy Python code with Google Cloud Functions and Cloud Scheduler that writes a simple table to a Google Cloud SQL PostgreSQL database.
- I created a PostgreSQL database
- Added the Cloud SQL Client role to the App Engine default service account
- Created a Cloud Pub/Sub topic
- Created a Cloud Scheduler job.
So far so good. Then I created the following function in main.py:
import pandas as pd
import sqlalchemy
from google.cloud.sql.connector import Connector

# Connection settings are defined elsewhere, e.g. from environment variables:
# INSTANCE_CONNECTION_NAME has the form "project:region:instance"
# DB_USER, DB_PASS, DB_NAME are the database credentials

connector = Connector()

def getconn():
    conn = connector.connect(
        INSTANCE_CONNECTION_NAME,
        "pg8000",
        user=DB_USER,
        password=DB_PASS,
        db=DB_NAME,
    )
    return conn

pool = sqlalchemy.create_engine(
    "postgresql+pg8000://",
    creator=getconn,
)

def testdf(event, context):
    df = pd.DataFrame({"a": [1, 2, 3, 4, 5],
                       "b": [1, 2, 3, 4, 5]})
    df.to_sql("test",
              con=pool,
              if_exists="replace",
              schema="myschema")
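To rule out the to_sql call itself, I checked the same creator-based engine pattern locally against SQLite (my own sketch, not from the codelabs; SQLite has no schemas, so the schema argument is omitted):

```python
import sqlite3
import pandas as pd
import sqlalchemy

def getconn():
    # Stand-in for connector.connect(...): any DB-API connection factory works here
    return sqlite3.connect("/tmp/test_local.db")

# Same creator-based engine construction as in main.py
pool = sqlalchemy.create_engine("sqlite://", creator=getconn)

df = pd.DataFrame({"a": [1, 2, 3, 4, 5],
                   "b": [1, 2, 3, 4, 5]})
df.to_sql("test", con=pool, if_exists="replace")

# Read the table back to confirm the write succeeded
print(pd.read_sql("SELECT COUNT(*) AS n FROM test", pool)["n"][0])  # → 5
```

This works locally, which suggests the DataFrame write itself is fine and the hang is in the connection to Cloud SQL.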
And the requirements.txt contains:
pandas
sqlalchemy
pg8000
cloud-sql-python-connector[pg8000]
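Since the dependencies are unpinned, each deploy may pull different pandas/SQLAlchemy combinations; loosely constraining them (the bounds below are my assumption, not tested values) could rule that out:

```
pandas>=2.0
SQLAlchemy>=2.0
pg8000>=1.30
cloud-sql-python-connector[pg8000]>=1.5
```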
When I test the function, it always times out. No error, just these logs:
I can't figure out why. I have tried several code snippets from:
https://colab.research.google.com/github/GoogleCloudPlatform/cloud-sql-python-connector/blob/main/samples/notebooks/postgres_python_connector.ipynb#scrollTo=UzHaM-6TXO8h
and from
https://codelabs.developers.google.com/codelabs/connecting-to-cloud-sql-with-cloud-functions#2
I suspect the permission and role configuration is causing the timeout. Any ideas?
Thanks