You can use mysqldump to grab the data from a query, however I've always found this hard to manage when you need to split the data into chunks of a specific size. As you want 1GB files, here is how I would split the table up into 1GB segments.
I've used INTO OUTFILE here, though mysqldump could also be used at this stage:
SELECT * FROM table
ORDER BY adminid ASC
INTO OUTFILE 'c:/table.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
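If you would rather use mysqldump for this step, its --tab option writes the same sort of delimited data file. The c:/dump directory below is just an example and, as with INTO OUTFILE, it has to be a directory the MySQL server itself can write to:

mysqldump --user=username --password=password --tab=c:/dump --fields-terminated-by=, --fields-optionally-enclosed-by="\"" databasename table

That gives you c:/dump/table.sql (the CREATE TABLE statement) and c:/dump/table.txt (the data), and the .txt file can then be split exactly as below.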
If you're using Windows, which really lacks a good split utility, I would suggest the GNU Core Utilities bundle: http://gnuwin32.sourceforge.net/packages/coreutils.htm
After installing, you can use split from the command prompt:
cd C:\Program Files (x86)\GnuWin32\bin
split -C 1024m -d c:\table.csv c:\table.part_
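That produces files such as c:\table.part_00, c:\table.part_01 and so on, each at most 1GB; -C splits only on line boundaries, so no row gets cut in half.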
If you're using Linux you've already got access to a good split utility.
If you export them you will probably want to import them again at some point, and that is where the .part_ at the end of the name matters: mysqlimport works out which table to import into by stripping the extension from the file name (everything from the . onwards), so table.part_00, table.part_01 and so on will all be loaded into the same database table.
These can then be imported using
mysqlimport --local --compress --user=username --password=password --host=dbserver.host --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename c:\table.csv
--local is needed, otherwise mysqlimport will look for the files on the remote host.
--compress is vital as it saves a lot of bandwidth.
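To load the split chunks rather than the single CSV, point the same command at the part files; mysqlimport accepts several file names in one invocation, and the names below assume the split command shown earlier was used:

mysqlimport --local --compress --user=username --password=password --host=dbserver.host --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename c:\table.part_00 c:\table.part_01

Because of the naming scheme, both chunks end up in the same table.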