I need to import a large SQL table from an Oracle database into a MySQL database. I "dumped" the Oracle table using RazorSQL, so I now have a large (2.5 GB) .sql file containing many INSERT statements.
This file is too big, and I want to split it so that I can import it in chunks. What would be a clever way to split such a file?
Using the split command would work, except that some string values contain newline characters, so there is a high risk of split cutting mid-statement. A Perl quickie using a regexp to split on ";\n" (end of statement) would work too, but there is still the remote possibility that such a character sequence could be found in the data.
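To make the idea concrete, the naive regexp approach I have in mind would be roughly this (the chunk size and file names are just placeholders, and it still carries the mid-statement risk described above):

    perl -e '
        my $chunk_size = 100 * 1024 * 1024;   # ~100 MB per chunk, adjust to taste
        my ($n, $bytes, $out) = (0, 0, undef);
        while (<STDIN>) {
            unless ($out) {
                open($out, ">", sprintf("chunk_%03d.sql", $n++)) or die $!;
                $bytes = 0;
            }
            print $out $_;
            $bytes += length;
            # cut only after a line that ends in ";"; note that a ";\n"
            # inside string data would still fool this
            if ($bytes >= $chunk_size && /;\s*$/) {
                close $out;
                undef $out;
            }
        }
        close $out if $out;
    ' < dump.sql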
Of course, I can always write a parser that understands string quoting. I am wondering, though, whether there is anything smarter.
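If I did write such a parser, I imagine it would be only slightly longer; something like the sketch below, which assumes the dump escapes quotes inside strings by doubling them ('') rather than with backslashes (I would have to check what RazorSQL actually emits):

    #!/usr/bin/perl
    # Quote-aware splitter (sketch): only treat a line ending in ";" as a
    # statement boundary when we are not inside a '...' string literal.
    use strict;
    use warnings;

    my $chunk_size = 100 * 1024 * 1024;   # ~100 MB per chunk
    my $in_string  = 0;
    my ($n, $bytes, $out) = (0, 0, undef);

    sub open_chunk {
        open($out, ">", sprintf("chunk_%03d.sql", $n++)) or die $!;
        $bytes = 0;
    }

    open_chunk();
    while (my $line = <STDIN>) {
        print $out $line;
        $bytes += length $line;

        # An odd number of quotes on this line flips the in/out-of-string
        # state; doubled '' escapes cancel out in the parity.
        my $quotes = ($line =~ tr/'//);
        $in_string = !$in_string if $quotes % 2;

        # Safe cut point: chunk is big enough, the line ends in ";",
        # and we are not inside a string.
        if (!$in_string && $bytes >= $chunk_size && $line =~ /;\s*$/) {
            close $out;
            open_chunk();
        }
    }
    close $out;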
Similar questions have been asked before, e.g.:
How do I split the output from mysqldump into smaller files?
But the only splitting solution suggested there was some Windows software, and Windows is not an option for me: I run Unix, namely Mac OS X.
To give some background, the reason I want to do this is that I get an error when importing the file into MySQL. Localizing the error is painful, if only because it is time consuming and because opening such a large text file for editing is non-trivial at best with most text editors. Some of my SQL tools don't even report the location of the error in the file, which strongly suggests a dichotomy approach: zeroing in on the error by splitting the file a number of times. Automating this would be useful.
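For the dichotomy part, I picture something as crude as the following: feed the chunks produced above to a throwaway database one by one, stop at the first one that fails, then re-split that chunk and repeat. The database name here is a placeholder, and credentials are assumed to come from ~/.my.cnf:

    #!/usr/bin/perl
    # Feed chunk files to a scratch database until one fails, narrowing
    # down where the bad statement is; re-split the failing chunk and rerun.
    use strict;
    use warnings;

    my $db = "scratch_import";   # throwaway database, placeholder name

    for my $chunk (sort glob "chunk_*.sql") {
        print "importing $chunk ...\n";
        my $rc = system("mysql $db < $chunk");
        if ($rc != 0) {
            print "import failed somewhere in $chunk\n";
            last;
        }
    }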
Thanks for any ideas.