
ERROR I'm getting:

This page isn’t working didn’t send any data. ERR_EMPTY_RESPONSE

I am using PHP to read the CSV file.

My PHP approach for processing the CSV data looks like this:

$csvAsArray = array_map('str_getcsv', file($tmpName));

I am sure the above code is creating the problem; after that point the rest of the code never executes. How can I import at least 1 million rows at a time? Can anyone help me out with which approach I should choose?

  • Not sure if it's a duplicate of https://stackoverflow.com/questions/9139202/how-to-parse-a-csv-file-using-php but it may help. – Nigel Ren Dec 19 '18 at 07:13

2 Answers


It looks like you're trying to grab the entire contents of the file in one gulp. Don't do that :) PHP array_map() isn't scalable to thousands ... or millions of lines.
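
For illustration, a minimal line-by-line read with fgetcsv() (a sketch only, assuming $tmpName is the uploaded file path from the question) keeps memory usage flat no matter how many rows the file has:

$handle = fopen($tmpName, 'r');
if ($handle === false) {
    die('Unable to open the CSV file');
}
while (($row = fgetcsv($handle)) !== false) {
    // process one row at a time, e.g. buffer it into the temp file for COPY
}
fclose($handle);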

SUGGESTION:

  1. Read your data into a temp file (as you appear to be doing now).

  2. Do a Postgresql COPY

EXAMPLE:

COPY my_table(my_columns, ...) 
FROM 'my_file.csv' DELIMITER ',' CSV HEADER;
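
If the COPY has to be triggered from PHP, a rough sketch with PDO (assuming the pdo_pgsql extension; the DSN, credentials, table name and file path below are placeholders, and the database server must be permitted to read that file):

$pdo = new PDO('pgsql:host=localhost;dbname=my_db', 'my_user', 'my_password');
// COPY ... FROM '<file>' runs on the server, so the path must exist there
$pdo->exec("COPY my_table (my_columns) FROM '/tmp/my_file.csv' DELIMITER ',' CSV HEADER");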
paulsm4
  • Thanks @paulsm4 for the answer, but the problem is that I have to translate the CSV data to DB data, e.g. user_name to user_id if user_name is present in the DB (a kind of foreign key). That is why I have to iterate over the CSV file line by line. Is there any other way to do that? – gaurav parkash Dec 19 '18 at 08:50
  • copy into a temporary table. – Jasen Dec 19 '18 at 09:29
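
A rough sketch of that staging-table idea, reusing the $pdo connection from the sketch above (every table and column name here is a placeholder): COPY the raw CSV into a temporary table, then translate user_name to user_id with a join while inserting into the real table.

// staging table mirrors the CSV columns as plain text
$pdo->exec("CREATE TEMP TABLE staging (user_name text, col1 text, col2 text)");
$pdo->exec("COPY staging FROM '/tmp/my_file.csv' DELIMITER ',' CSV HEADER");
// translate user_name -> user_id via a join while loading the target table
$pdo->exec("INSERT INTO my_table (user_id, col1, col2)
            SELECT u.user_id, s.col1, s.col2
            FROM staging s
            JOIN users u ON u.user_name = s.user_name");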

I would suggest using the league/csv package for CSV parsing and importing. @paulsm4 is correct that there is no need to load the whole file into memory and then work with it; you should read it line by line instead. This package is well-maintained, does all of that under the hood, and does it quite effectively. To my mind it is also much more flexible than the Postgres COPY command: you can filter the contents and map callbacks onto fields/rows, all on the PHP side.
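
For illustration, a minimal sketch with league/csv 9.x (assuming it was installed with composer require league/csv; the file path and column name are placeholders):

use League\Csv\Reader;

require 'vendor/autoload.php';

// stream the CSV instead of loading it all into memory
$csv = Reader::createFromPath('/tmp/my_file.csv', 'r');
$csv->setHeaderOffset(0); // treat the first row as the header

foreach ($csv->getRecords() as $record) {
    // $record is an associative array keyed by the header row,
    // e.g. $record['user_name'] can be looked up to find the matching user_id
}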

Alexey