
Based on the accepted answer here, I am able to export the results of a MySQL query to a CSV file on my Amazon EC2 instance using the following:

mysql -u user -ppass -e "SELECT * FROM table" > /data.csv

However, as the exported file is large, I want to export it directly to an Amazon S3 bucket (s3://mybucket) which is accessible from my EC2 instance.

I tried:

mysql -u user -ppass -e "SELECT * FROM table" > s3://mybucket/data.csv

But it doesn't export the file.

mallet

2 Answers


If you want to use the mysql command line program, then you have two choices:

  • Increase the size of your instance's storage so that the file can be created. Then copy the file to S3
  • Create a separate program or script that reads from Standard Input and writes to S3.

Another solution would be to create a simple program that processes your SELECT and directly writes to S3. There are lots of examples of this on the Internet in Python, Java, etc.

John Hanley

In addition to the accepted answer, if you are using Aurora (MySQL-compatible), then you can do:

SELECT * FROM table INTO OUTFILE S3 's3://mybucket/table-data';

https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html

An alternate approach:

mysqldump -h [db_hostname] -u [db_user] -p[db_passwd] [databasename] [tablename] | aws s3 cp - s3://[s3_bucketname]/[mysqldump_filename]

This streams the dump directly to S3 without occupying local disk space.
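As a variant of the same pipeline, assuming the AWS CLI is installed and configured on the instance, you can compress the dump in transit with gzip to reduce transfer time and storage cost (the bracketed names are placeholders, as above):

```shell
# Stream the dump through gzip, then upload from stdin ("-") to S3.
# Nothing is written to local disk at any point.
mysqldump -h [db_hostname] -u [db_user] -p[db_passwd] [databasename] [tablename] \
  | gzip \
  | aws s3 cp - s3://[s3_bucketname]/[mysqldump_filename].gz
```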

Ravi Hirani