I am pretty new to databases and need help. I have n (large) files, each of which contains m (very large) text files of numeric data. What is the best way to import these files into a MySQL database, particularly with regard to naming the fields?
have you tried: http://www.lullabot.com/blog/importexport-large-mysql-databases – Book Of Zeus Oct 30 '11 at 15:10
2 Answers
If you only need to do this once, or the import process stays fairly similar each time, I would recommend the ETL tool Kettle by Pentaho. While it is far from perfect, I've found that I can often import data in a fraction of the time I would spend writing a script for one specific file. You select a text-file input, specify the delimiters, fixed widths, etc., and then export directly into your SQL server (MySQL, SQLite, Oracle, and many more are supported).
If you would like to research other tools like this, look for ETL software, short for Extract, Transform, Load.
If you're familiar with Python, I would also recommend the last post on this page.
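As a rough illustration of the Python route, here is a minimal sketch that infers the column count from the first data row of a numeric file and builds the matching `CREATE TABLE` and `LOAD DATA LOCAL INFILE` statements. The table name, the placeholder column names `c0, c1, ...`, and the file name `data.txt` are all assumptions you would adapt to your own files:

```python
import csv
import io

def make_load_statements(table, sample_text, delimiter="\t"):
    """Infer the column count from the first data row and build
    CREATE TABLE / LOAD DATA statements for a numeric text file.
    Column names c0, c1, ... are placeholders; rename them to suit."""
    first_row = next(csv.reader(io.StringIO(sample_text), delimiter=delimiter))
    cols = ["c%d DOUBLE" % i for i in range(len(first_row))]
    create = "CREATE TABLE %s (%s);" % (table, ", ".join(cols))
    load = ("LOAD DATA LOCAL INFILE 'data.txt' INTO TABLE %s "
            "FIELDS TERMINATED BY '%s';" % (table, delimiter))
    return create, load

create, load = make_load_statements("measurements", "1.5\t2.0\t3.25\n")
print(create)  # CREATE TABLE measurements (c0 DOUBLE, c1 DOUBLE, c2 DOUBLE);
print(load)
```

You would then run the generated statements through your MySQL client; `LOAD DATA LOCAL INFILE` requires `local_infile` to be enabled on the server.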

Usually one would write a script in Perl (or whatever scripting language you prefer that offers MySQL support) and process the files one after another, applying whatever processing the files, or the lines inside them, need. If you would like a more specific answer, ask a more specific question.
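The same scripted approach sketched in Python rather than Perl might look like the following. The table layout (two numeric columns plus the source file name), the file glob, and the batch size are all illustrative assumptions; `sqlite3` is used here only as a self-contained stand-in, since a real MySQL driver such as mysql-connector-python exposes the same DB-API calls (with `%s` placeholders instead of `?`):

```python
import glob
import sqlite3  # stand-in for a MySQL driver; with MySQL you would use
                # e.g. mysql-connector-python and the same DB-API calls

def import_files(conn, pattern, table="numbers", batch=1000):
    """Process files one after another, parsing each line and
    inserting rows in batches. Table/column names are illustrative."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS %s (fname TEXT, a REAL, b REAL)"
                % table)
    for fname in sorted(glob.glob(pattern)):
        rows = []
        with open(fname) as fh:
            for line in fh:
                a, b = line.split()[:2]  # per-line processing goes here
                rows.append((fname, float(a), float(b)))
                if len(rows) >= batch:
                    cur.executemany("INSERT INTO %s VALUES (?, ?, ?)" % table,
                                    rows)
                    rows = []
        if rows:
            cur.executemany("INSERT INTO %s VALUES (?, ?, ?)" % table, rows)
    conn.commit()
```

Batching the inserts with `executemany` keeps the per-row overhead down, which matters when the files are very large.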
