I'm trying to push (with COPY) a big file from S3 to Redshift. I'm using SQLAlchemy in Python to execute the SQL command, but it looks like the COPY only works if I TRUNCATE the table first.
The connection works fine:
from sqlalchemy import create_engine
engine = create_engine('postgresql://XXXX:XXXX@XXXX:XXXX/XXXX')
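(To show that the connection itself is not the problem, a trivial query like the one below runs fine; the SELECT 1 here is just an illustrative sanity check, not part of my actual load script.)

# illustrative sanity check: run a trivial query through the same engine
result = engine.execute("SELECT 1")
print(result.scalar())  # prints 1 if the round trip to Redshift works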
With this command string (truncating the table before the COPY), everything loads as expected:
toRedshift = "TRUNCATE TABLE public.my_table; COPY public.my_table from 's3://XXXX/part-p.csv' CREDENTIALS 'aws_access_key_id=AAAAAAA;aws_secret_access_key=BBBBBBB' gzip removequotes IGNOREHEADER 0 delimiter '|';"
engine.execute(toRedshift)
If I remove the "TRUNCATE TABLE public.my_table;" part, the COPY seems to have no effect:
toRedshift = "COPY public.my_table from 's3://XXXX/part-p.csv' CREDENTIALS 'aws_access_key_id=AAAAAAA;aws_secret_access_key=BBBBBBB' gzip removequotes IGNOREHEADER 0 delimiter '|';"
engine.execute(toRedshift)
But the same command works perfectly with any other SQL client (DBeaver, for example).
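To double-check that the failing COPY really loads nothing, I compare the row count before and after running it; the COUNT(*) queries below are just an illustrative check I run by hand, not part of the load script itself.

# illustrative check: row count before and after the COPY without TRUNCATE
before = engine.execute("SELECT COUNT(*) FROM public.my_table").scalar()
engine.execute(toRedshift)  # the COPY-only command string from above
after = engine.execute("SELECT COUNT(*) FROM public.my_table").scalar()
print(before, after)  # the count does not change when TRUNCATE is omitted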