
My web application lets users import an Excel file and writes the data from the file into the MySQL database. The problem is, when the Excel file has lots of entries, even 1000 rows, I get an error saying PHP ran out of memory. This occurs while reading the file. I have assigned 1024MB to PHP in the php.ini file.

My question is: how should I go about importing such large data in PHP?

I am using CodeIgniter. For reading the Excel file, I am using this library.


SOLVED: I used CSV instead of XLS, and I could import 10,000 rows of data within seconds. Thank you all for your help.

varun1505
  • Increase the memory limit? Or, even better, don't handle large sets of data in PHP -- it isn't what it's designed for. – lonesomeday Sep 17 '12 at 21:56
  • PHP is not memory efficient when dealing with large data structures, and Excel handling libs have to keep the whole spreadsheet in memory. Increase the memory limit, or decrease the amount of data being dealt with. – Marc B Sep 17 '12 at 21:57
  • 1GB of memory for PHP is insane..., but this question is probably best posed towards the lib in question. If you have to use that lib for your needs then handling large data sets isn't your only problem. – aknosis Sep 17 '12 at 21:57
  • I wouldn't call 1000 rows a lot; I imported a 60k-row CSV file a few minutes ago. –  Sep 17 '12 at 21:57
  • Increasing the memory limit didn't help. I had tried that. – varun1505 Sep 17 '12 at 21:57
  • I would start by saving it off, then using fopen and fread to parse it in segments. Otherwise, set up a separate service dedicated to importing and design a queue system that the service can read and write to, where it can poll for jobs and save back the status. – Brad Christie Sep 17 '12 at 21:57
  • maybe this will help: http://stackoverflow.com/questions/4666746/how-to-read-large-worksheets-from-large-excel-files-27mb-with-phpexcel – galchen Sep 17 '12 at 21:59
  • Are you using PHPExcel (you've tagged this as though you are) or are you using php-excel-reader? – Mark Baker Sep 17 '12 at 22:22
  • There's also a chance you have a never-ending loop. Can you post some of the code you're using to read from the Excel sheet? Every time I've seen an out-of-memory error, this has typically been the reason. – DaOgre Sep 17 '12 at 23:51

2 Answers


As others have said, 1000 records is not much. Make sure you process the records one at a time, or a few at a time, and that the variables you use for each iteration either go out of scope once you're finished with that row or are reused.

If you can avoid the need to process Excel files by exporting them to CSV, that's even better, because then you wouldn't need such a library (which might or might not have its own memory issues).
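
For example, here is a rough sketch of a CSV import that reads one row at a time with fgetcsv() and inserts it with a prepared statement. The file name, table, and column names (data.csv, imports, col1, col2) are placeholders, not the asker's actual schema:

<?php
// Minimal sketch, assuming the spreadsheet has been exported to data.csv and
// that the table/column names are placeholders rather than the real schema.
$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Could not open CSV file');
}

$pdo  = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO imports (col1, col2) VALUES (?, ?)');

// fgetcsv() reads one line per call, so only the current row is ever in memory.
while (($row = fgetcsv($handle)) !== false) {
    $stmt->execute(array($row[0], $row[1]));
}

fclose($handle);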

Don't be afraid of increasing the memory limit if you need to and that solves the problem; buying memory is sometimes the cheapest option. And don't let the 1 GB scare you: it is a lot for such a simple task, but if you have the memory and that's all you need to do, then it's good enough for the moment.
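
If you do raise the limit, you can scope it to the import script instead of changing php.ini globally. A minimal sketch (256M is just an illustrative value):

<?php
// Raise the limit for this request only; the value is illustrative.
ini_set('memory_limit', '256M');

// Check what PHP actually ended up with.
echo 'memory_limit is now ' . ini_get('memory_limit') . PHP_EOL;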

And as a plus, if you are using an old version of PHP, try updating to PHP 5.4, which handles memory much better than its predecessors.

fd8s0

Instead of inserting one row at a time in a loop, insert 100 rows at a time.

You can always run

INSERT INTO myTable (col1, col2) VALUES
(val1, val2), (val3, val4), (val5, val6) ...

This way the number of network round trips is reduced, which in turn reduces resource usage.
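
For example, since the asker is on CodeIgniter, a rough sketch of batching inside a controller or model could use the framework's insert_batch(). The table and column names are placeholders, and $rows stands for the rows already parsed from the file:

<?php
// Minimal sketch using CodeIgniter's insert_batch(); names are placeholders.
$batch = array();

foreach ($rows as $row) {
    $batch[] = array(
        'col1' => $row[0],
        'col2' => $row[1],
    );

    // Flush every 100 rows so memory stays bounded and each INSERT statement
    // carries many value tuples.
    if (count($batch) === 100) {
        $this->db->insert_batch('myTable', $batch);
        $batch = array();
    }
}

// Insert any leftover rows.
if (!empty($batch)) {
    $this->db->insert_batch('myTable', $batch);
}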

Rahul Prasad