
I am trying to copy into a Redshift table some data that contains UTF-8 characters (Portuguese accented letters).

I created the table, setting the VARCHAR(256) data type where needed, as suggested in other answers.

When I run the COPY, all the special characters arrive as question marks (e.g. "Boleto de cobran?a"), as expected given the ACCEPTINVCHARS AS '?' option.

To preserve the special characters I re-encoded my CSV file as UTF-8 using Notepad, but as soon as I do that I start getting errors and can no longer copy the data.
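One possible complication: older versions of Notepad write "UTF-8" files with a byte-order mark at the start, which some loaders are picky about. Below is a minimal sketch of writing the CSV as plain UTF-8 directly from R instead, assuming the data is already in a data frame (my_data and my_database.csv are placeholder names, not my real objects):

# my_data stands in for the data frame holding the actual data
my_data <- data.frame(descricao = c("Boleto de cobrança", "Cartão de crédito"),
                      valor     = c(10.50, 20.00),
                      parcelas  = c(1L, 3L))

# fileEncoding = "UTF-8" makes R write the file as UTF-8 (no byte-order mark);
# na = "#NULO" matches the NULL marker used in the COPY statement
write.csv(my_data, "my_database.csv", row.names = FALSE,
          na = "#NULO", fileEncoding = "UTF-8")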

Code-wise I used:

dbGetQuery(myRedshift$con, "CREATE TABLE <NAME_OF_MY_TABLE> (
<VARIABLE_1>                        VARCHAR(256) sortkey distkey,            
<VARIABLE_2>                        NUMERIC(10,2),
<VARIABLE_3>                        INTEGER)")

dbGetQuery(myRedshift$con, "<NAME_OF_MY_TABLE> from 
's3://<S3_FOLDER_NAME>/<DATABASE_NAME.csv' 
CREDENTIALS 'aws_access_key_id=<MY_KEY>;aws_secret_access_key 
<MY_SA_KEY>'            
       region '<S3_REGION>'
       CSV DELIMITER ','
       NULL '#NULO'
       IGNOREHEADER 1")

The error I get is the following:

Error in postgresqlExecStatement(conn, statement, ...) : RS-DBI driver: (could not
Retrieve the result : ERROR: Load into table '' failed. Check 'stl_load_errors'
system table for details. ) NULL
Warning message: In postgresqlQuickSQL(conn, statement, ...) : Could not create
executecopy from 's3://s3:///;aws_secret_access_key='
region '' CSV DELIMITER ',' NULL '#NULO' IGNOREHEADER 1
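Since the error points at stl_load_errors, the actual reason can be pulled through the same connection; a minimal sketch of querying it, using that system table's standard columns:

# Show the most recent COPY failures recorded by Redshift
dbGetQuery(myRedshift$con, "SELECT starttime, filename, line_number, colname,
       err_code, err_reason, raw_field_value
FROM stl_load_errors
ORDER BY starttime DESC
LIMIT 10")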

Thanks for any help. Best,

MCS
  • See "question mark" in http://stackoverflow.com/questions/38363566/trouble-with-utf8-characters-what-i-see-is-not-what-i-stored If that does not lead you to the answer, do the debugging it suggests and add then info to your question. – Rick James Sep 16 '17 at 21:11
  • That helped. Thanks Rick! – MCS Oct 11 '17 at 16:56
