
I have a big CSV file, about 11 MB, and I need to load its contents into a Postgres database.

I use a PHP script to do this job, but it always stops at some point.

I raised the PHP memory limit and other settings, and I could load more data, but still not all of it.

How can I solve this? Is there some cache that I need to clear? Is there a secret to handling big files in PHP?

Thanks in advance.

UPDATE: Adding some code

$handler = fopen($fileName, "r");
$dbHandler = pg_connect($databaseConfig);

$sql = "";
$lineCount = 0;

while (($line = fgetcsv($handler, 0, ";")) !== false) {
    // Algorithms to transform data

    // Adding SQL statements to the $sql variable

    // I am using a "batch" idea: execute all the SQL formed after every 5000 read lines
    $lineCount++;
    if ($lineCount % 5000 === 0) {
        $results = pg_query($dbHandler, $sql);
        $sql = "";
    }
}
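
To make the idea above concrete, here is a minimal sketch of the batch loop (the table legacy_import (col_a, col_b) and the escaping step are placeholders, not my real schema; the point is that the buffer is reset after each flush so memory stays flat):

$handler = fopen($fileName, "r");
$dbHandler = pg_connect($databaseConfig);

$values = array();

while (($line = fgetcsv($handler, 0, ";")) !== false) {
    // Hypothetical transformation: escape each field before building the SQL
    $values[] = "('" . pg_escape_string($dbHandler, $line[0]) . "', '"
                     . pg_escape_string($dbHandler, $line[1]) . "')";

    // Flush every 5000 lines so the SQL string never grows without bound
    if (count($values) === 5000) {
        pg_query($dbHandler, "INSERT INTO legacy_import (col_a, col_b) VALUES " . implode(",", $values));
        $values = array(); // reset the buffer to keep memory flat
    }
}

// Flush whatever is left after the last full batch
if (count($values) > 0) {
    pg_query($dbHandler, "INSERT INTO legacy_import (col_a, col_b) VALUES " . implode(",", $values));
}

fclose($handler);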
Guido
  • The real question is, why would you want to load such a big file into the database? The best way around this is to simply store the path to the file in the DB and then just use `fread` on the file system. – h2ooooooo Feb 04 '13 at 12:52
  • Check this: http://stackoverflow.com/questions/2987433/how-to-import-csv-file-data-into-a-postgres-table – Hawili Feb 04 '13 at 12:54
  • I have to migrate data from an old system to the new one, and the data is in that format. I have to transform it and save it in the new system. – Guido Feb 04 '13 at 12:54
  • Okay, so to clarify you don't want to simply import the CSV file content byte for byte, but rather create a table structure fitting the CSV, and then filling out rows in the database with what's in the CSV file? – h2ooooooo Feb 04 '13 at 12:56
  • @Hawili But I need to do some transformations to the data before saving it in the new database. – Guido Feb 04 '13 at 12:57
  • Yes @h2ooooooo! I need to read the file line by line and then save the modified data to my database. – Guido Feb 04 '13 at 12:58
  • @Guido You say you have a PHP script to do this job – how are you reading data from the CSV file? Can you post the source code? (Considering it takes up all your memory, you might be doing it in an absurd way.) – h2ooooooo Feb 04 '13 at 13:00
  • Making PHP do the migration is funny. – Uğur Gümüşhan Feb 04 '13 at 13:10
  • As an update to this, have you considered lowering your buffer to 1000 instead of 5000 lines? – h2ooooooo Feb 05 '13 at 11:56
  • My file has 300,000 lines; if I worked 1,000 lines at a time, it would be too slow. – Guido Feb 05 '13 at 15:46

1 Answer


If you have direct access to the server (and you aren't working through some version-control software such as Subversion), Postgres has a far better option that is far less demanding in terms of resources. Keep in mind that PHP is a slow and resource-consuming language:

COPY my_table_name FROM '/home/myfile.csv' DELIMITERS ',' CSV
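
If you need to issue this from PHP anyway, here is a minimal sketch (my_table_name, the path, and the ';' delimiter are placeholders for your actual table and file; COPY FROM a file runs on the database server, so the server process must be able to read the file):

$db = pg_connect($databaseConfig);

// The COPY is executed entirely on the server side,
// so PHP never holds the file contents in memory.
$result = pg_query($db, "COPY my_table_name FROM '/home/myfile.csv' DELIMITER ';' CSV");

if ($result === false) {
    echo pg_last_error($db); // e.g. permission problems or a column mismatch
}

If you still need the transformations mentioned in the comments, one option is to COPY into a staging table first and then do the transformation with SQL.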
alxkls