
I'm able to successfully connect to my Redshift cluster via my IAM role with the following from the redshift_connector package:

import redshift_connector

conn = redshift_connector.connect(
    iam=True,
    database='dbname',
    db_user='user',
    password='',
    cluster_identifier='clusterID',
    profile='aws_credential_profile',
    region='region',
    ssl=True,
    ssl_insecure=True
)

However, when I try to recreate this with SQLAlchemy and run it via ipython-sql, I (a) can't run it without a user (rather than just a db_user), and (b) get authentication errors. Are there any configurations I'm missing?

url="""redshift+redshift_connector://hostname:port/dbname?ssl=true&ssl_insecure=true&iam=true&region=region&cluster_identifier=clusterid&profile=aws_credential_profile&db_user=user"""
%sql $url
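One way to avoid mistakes in a long connection URL like the one above is to assemble the query string programmatically. Below is a minimal sketch using only the standard library; the host, port 5439 (Redshift's default), and all parameter values are placeholders, not values from this question:

```python
from urllib.parse import urlencode

# Placeholder connection parameters; substitute your own values.
params = {
    "iam": "true",
    "ssl": "true",
    "ssl_insecure": "true",
    "region": "region",
    "cluster_identifier": "clusterid",
    "profile": "aws_credential_profile",
    "db_user": "user",
}

# urlencode handles escaping and joins the pairs with "&".
url = f"redshift+redshift_connector://hostname:5439/dbname?{urlencode(params)}"
print(url)
```

The resulting string can then be passed to `%sql $url` exactly as in the question.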
            
econgineer

1 Answer


You can connect to Redshift in a similar manner to what you've tried, but with a different driver. Check out this guide. Before you do, please make sure you have the right credentials to connect to your DWH.

Essentially, you need to make sure you're using psycopg2, as follows (I've tested it using JupySQL, but ipython-sql works as well):

pip install psycopg2-binary sqlalchemy jupysql --quiet
%load_ext sql
%config SqlMagic.displaylimit = 5  # Optional: limits the number of rows displayed.
%sql postgresql+psycopg2://user_name:password@host_name:port_num/db

Then query it. There's a guide on this driver here. psycopg2 sometimes has installation issues, in which case you can use alternatives such as psycopg2-binary or pg8000.

Also, if you still can't get it to work, try connecting straight from SQLAlchemy; its error messages are sometimes clearer, making it easier to understand what's going wrong with the connection request.
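To test the connection straight from SQLAlchemy, something like the following sketch can help surface the underlying error message. The `check_connection` helper and the placeholder URL are illustrative, not part of any library API:

```python
from sqlalchemy import create_engine, text

def check_connection(url: str) -> bool:
    """Run a trivial query against the given URL; print the raw error on failure."""
    try:
        engine = create_engine(url)
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
        return True
    except Exception as exc:
        # The driver-level exception is usually more informative
        # than what ipython-sql surfaces.
        print(f"Connection failed: {exc}")
        return False

# With your own placeholders filled in:
# check_connection("postgresql+psycopg2://user_name:password@host_name:port_num/db")
```

If this fails, the printed exception (bad credentials, unreachable host, missing driver) tells you which part of the URL to fix before going back to the `%sql` magic.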

Ido Michael