
Using mysqldump, I can dump my databases and tables into separate files.

My question is, is there a way within mysqldump to split up these table files into smaller parts?

So instead of one 10GB .sql file, I would get ten 1GB .sql files for the same table?

Jason Macgowan
    duplicate with http://stackoverflow.com/questions/132902/how-do-i-split-the-output-from-mysqldump-into-smaller-files – miah Mar 27 '13 at 14:57

2 Answers


You can use mysqldump to grab data from a query; however, I've always found this hard to manage when you need to split the data into chunks of a specific size.

As you want 1 GB files, here is how I would split the table up into 1 GB segments.

I've used INTO OUTFILE here, but mysqldump could also be used at this stage (a rough mysqldump equivalent is sketched after the query):

SELECT * FROM table
ORDER BY adminid ASC
INTO OUTFILE 'c:/table.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
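
Here is what that mysqldump equivalent could look like. This is a sketch: databasename is an assumption, and note that with --tab the MySQL server itself (not the client) writes the files, so it needs write access to the directory.

mysqldump --tab=c:/dumpdir --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename table

This produces table.sql (the schema) and table.txt (the data) in c:/dumpdir; the .txt file can be split and imported exactly like the .csv above.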

If you're using Windows, which really lacks a good split utility, I would suggest the GNU Core Utilities bundle: http://gnuwin32.sourceforge.net/packages/coreutils.htm

After installing, you can use split from the command prompt:

cd C:\Program Files (x86)\GnuWin32\bin
split -C 1024m -d c:\table.csv c:\table.part_

If you're using Linux, you've already got access to a good split utility.

If you export the files you will probably want to import them again at some point. That is where the .part_ at the end of the filename is important: mysqlimport works out the table name to import into from the file name, ignoring everything from the first . onwards, so the . lets you split the table across multiple files that all import into the same database table.
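
For example, the part files produced by the split command above all map back to the same table, because mysqlimport only looks at the name before the first .:

table.part_00  ->  table
table.part_01  ->  table
table.part_02  ->  table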

These can then be imported using:

mysqlimport --local --compress --user=username --password=password --host=dbserver.host --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename c:\table.part_00 c:\table.part_01

--local is needed, otherwise mysqlimport tries to find the files on the remote host.

--compress is vital, as it saves a lot of bandwidth.
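
Rather than naming every part by hand, a loop at the Windows command prompt can feed them all through mysqlimport. A minimal sketch, assuming the part files sit in c:\ and the same credentials as above:

for %f in (c:\table.part_*) do mysqlimport --local --compress --user=username --password=password --host=dbserver.host --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename %f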

Steve
  • The real answer is no, you can't. Sure I can use "split", but that's not part of mysql dump. Accepted answer because this is the best alternative – Jason Macgowan Mar 29 '13 at 18:53

@jason, you can split the full dump into tables and databases. You can use mysql-dump-splitter to extract the table / database of your choice. You can also apply filters during the dump process, as follows:

Dump all data for 2015:
mysqldump --all-databases --where="DATEFIELD >= '2015-01-01' AND DATEFIELD < '2016-01-01'" | gzip > ALLDUMP.2015.sql.gz

Provided every table has the DATEFIELD column! Alternatively, you can also specify an ID column to restrict the dump to only extract IDs in a specified range (see the sketch below).
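
Here is what that ID-based chunking could look like. A minimal sketch, not from the original answer: databasename, tablename, the integer id column, and the range boundaries are all assumptions.

mysqldump --where="id >= 1 AND id < 5000000" databasename tablename | gzip > tablename.chunk1.sql.gz
mysqldump --where="id >= 5000000 AND id < 10000000" databasename tablename | gzip > tablename.chunk2.sql.gz

Each chunk is a self-contained .sql.gz that restores into the same table.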

MySQL_user