
I'm running the following command in Redshift:

myDB=> unload ('select * from (select * from myTable limit 2147483647);')
       to 's3://myBucket/'
       credentials 'aws_access_key_id=***;aws_secret_access_key=***';

Here is what I get back:

ERROR:  S3ServiceException:The bucket you are attempting to access must be addressed
        using the specified endpoint. Please send all future requests to this 
        endpoint.,Status 301,Error PermanentRedirect,Rid 85ACD9FFAFC5CE8F,
        ExtRid vsz4/0NdOAYbaJ48WYCnrYBCvuuL0cBTdcEN

DETAIL:  
-----------------------------------------------
error:  S3ServiceException:The bucket you are attempting to access must be addressed
        using the specified endpoint. Please send all future requests to this 
        endpoint.,Status 301,Error PermanentRedirect,Rid 85ACD9FFAFC5CE8F,
        ExtRid vsz4/0NdOAYbaJ48WYCnrYBCvuuL0cBTdcEN
code:      8001
context:   Listing bucket=myBucket prefix=
query:     0
location:  s3_unloader.cpp:181
process:   padbmaster [pid=19100]
-----------------------------------------------

Any thoughts? Or maybe ideas how to dump data from Redshift into MySQL or something similar?

eistrati

2 Answers


This error is returned when path-style syntax is used to address a bucket outside the US Standard region. Create a new bucket in the same region as your Redshift cluster and everything should work.
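If moving the data is an option, the fix is simply to UNLOAD into a bucket created in the cluster's own region. A minimal sketch (bucket name, prefix, and credentials below are placeholders, not values from the question):

```sql
-- Sketch: UNLOAD to a bucket created in the same region as the cluster.
-- 'my-same-region-bucket' and the 'export/part_' prefix are illustrative.
unload ('select * from myTable')
to 's3://my-same-region-bucket/export/part_'
credentials 'aws_access_key_id=***;aws_secret_access_key=***';
```

Redshift writes one or more output files under the given prefix, so pointing UNLOAD at a key prefix (rather than the bare bucket root) keeps the exported slices grouped together.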

Tomasz Tybulewicz
    To rephrase a little bit: Redshift cluster and S3 bucket must be in the same region :) – eistrati Feb 14 '14 at 13:49
  • You can write to a bucket in a different region, but you must specify that region when using UNLOAD, e.g.: unload ('select * from (select * from myTable limit 2147483647);') to 's3://myBucket/' credentials 'aws_access_key_id=***;aws_secret_access_key=***' region 'us-west-2'; Resource: https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html#r_UNLOAD-synopsis – Sam Gruse Sep 24 '19 at 21:24
-1

You are missing the prefix part of the object path. Try using s3://myBucket/myPrefix
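With a prefix added, the command from the question would look like this (the prefix name is illustrative; this alone does not resolve the 301 error if the bucket is in a different region):

```sql
-- Sketch: same UNLOAD as in the question, with an object-key prefix added.
-- 'myPrefix' is a placeholder chosen for illustration.
unload ('select * from (select * from myTable limit 2147483647)')
to 's3://myBucket/myPrefix'
credentials 'aws_access_key_id=***;aws_secret_access_key=***';
```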

Tomasz Tybulewicz
  • Documentation also states that: The Amazon S3 bucket where Amazon Redshift will write the output files must reside in the same region as your cluster. – Tomasz Tybulewicz Feb 13 '14 at 21:51
  • The error message is returned when using path-style syntax with a non-US S3 bucket. Create a new bucket in the same region as your Redshift cluster and then everything should work. – Tomasz Tybulewicz Feb 13 '14 at 22:12
  • That's the correct answer! Redshift cluster was in Oregon, but S3 bucket in US Standard. Thanks a lot :) Can you please submit it separately and I'll accept it? – eistrati Feb 13 '14 at 22:19