I am trying to copy data containing UTF-8 characters (Portuguese accented letters) into a Redshift table.
I create the table with the VARCHAR(256) data type where needed, as suggested in other answers.
When I COPY into the table, all the special characters are loaded as question marks (e.g. "Boleto de cobran?a"), which is what I specified with ACCEPTINVCHARS AS '?'.
To preserve the special characters I re-encoded my CSV file as UTF-8 using Notepad, but as soon as I do that I start getting errors and I am no longer able to copy the data.
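For reference, this is roughly how the CSV could be written as UTF-8 directly from R instead of re-saving it in Notepad (a minimal sketch; my_data and the file name are placeholders):

write.csv(my_data,                      # placeholder data frame
          file = "<DATABASE_NAME>.csv",
          row.names = FALSE,
          na = "#NULO",                 # matches the NULL '#NULO' option used in the COPY
          fileEncoding = "UTF-8")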
Code-wise I used:
dbGetQuery(myRedshift$con, "CREATE TABLE <NAME_OF_MY_TABLE> (
<VARIABLE_1> VARCHAR(256) sortkey distkey,
<VARIABLE_2> NUMERIC(10,2),
<VARIABLE_3> INTEGER)")
dbGetQuery(myRedshift$con, "copy <NAME_OF_MY_TABLE> from
's3://<S3_FOLDER_NAME>/<DATABASE_NAME>.csv'
CREDENTIALS 'aws_access_key_id=<MY_KEY>;aws_secret_access_key=<MY_SA_KEY>'
region '<S3_REGION>'
CSV DELIMITER ','
NULL '#NULO'
IGNOREHEADER 1")
The error I get is the following:
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not Retrieve the result : ERROR: Load into table '' failed. Check 'stl_load_errors' system table for details.
)
NULL
Warning message:
In postgresqlQuickSQL(conn, statement, ...) :
Could not create executecopy from
's3://s3:///;aws_secret_access_key='
region ''
CSV DELIMITER ','
NULL '#NULO'
IGNOREHEADER 1
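Since the error points at the 'stl_load_errors' system table, this is a minimal sketch of how it could be queried from the same connection to see the underlying load error (the columns are standard stl_load_errors columns; the LIMIT is arbitrary):

# Show the most recent load errors; err_reason and raw_line usually identify the offending row
dbGetQuery(myRedshift$con, "SELECT starttime, filename, line_number, colname,
                                   err_code, err_reason, raw_line
                            FROM stl_load_errors
                            ORDER BY starttime DESC
                            LIMIT 10")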
Thanks for any help. Best,