
I'm trying to upload 10k+ records into my database (using PHP and MySQL). However, the file seems to be too large. If I reduce the file to about 9,300 records it works just fine, but above that my code seems to fail at the following line: $excelReader = PHPExcel_IOFactory::createReaderForFile($tmpfname); I'm including only a snippet of my code (the part I think is relevant to this question). I have upload_max_filesize=124M and post_max_size=124M as well. What could be the problem?

<?php
// Path of the uploaded spreadsheet in PHP's temporary upload location.
$tmpfname = $_FILES['file_save']['tmp_name'];
echo "0"; // debug markers to trace how far the script gets

// This is the line that fails for files above ~9300 records.
$excelReader = PHPExcel_IOFactory::createReaderForFile($tmpfname);
echo "1";

// Load the whole workbook into memory.
$excelObj = $excelReader->load($tmpfname);
echo "2";

$worksheet = $excelObj->getSheet(0);
echo "3";

$lastRow = $worksheet->getHighestRow();
?>
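A good first step is to make the failure visible instead of guessing: turn on error display and print the limits the request is actually running under. A minimal diagnostic sketch; the 512M and 300 values below are illustrative, and whether your host allows overriding memory_limit at runtime is an assumption:

<?php
// Surface the fatal error and show the limits in play for this request.
ini_set('display_errors', '1');
error_reporting(E_ALL);

echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";
echo 'post_max_size: ' . ini_get('post_max_size') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";

// Stopgap: raise the limits for this script only (illustrative values).
ini_set('memory_limit', '512M');
set_time_limit(300); // seconds

echo 'peak memory so far: ' . memory_get_peak_usage(true) . " bytes\n";
?>

If the script dies with "Allowed memory size ... exhausted", the problem is the in-memory workbook, not the upload limits.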
  • The question is what error you get. Take a look into your HTTP server's error log file. Most likely you ran out of memory, or maybe out of the configured time limit. – arkascha Oct 04 '16 at 13:02
  • @arkascha Sorry if I come off ignorant but where is my error log file located? – Bobby Oct 04 '16 at 13:04
  • I hope the link below is useful for you: http://stackoverflow.com/questions/23869743/how-to-import-large-excel-file-to-mysql-database-using-php Thank you. – Ragu Natarajan Oct 04 '16 at 13:06
  • Not ignorant at all! A bit lazy maybe ;-) That is documented and also answered here on SO: http://stackoverflow.com/questions/5127838/where-does-php-store-the-error-log-php5-apache-fastcgi-cpanel – arkascha Oct 04 '16 at 13:15
  • @RaguNatarajan sorry, but I looked at that already and it didn't solve it. Thank you. – Bobby Oct 04 '16 at 13:15
  • @arkascha assuming I'm running out of memory, how can I solve this issue? thanks – Bobby Oct 04 '16 at 14:08
  • Well, 1. by raising the memory limit configured for the execution of requests, or 2. by reducing the amount of memory actually used. The latter is often the more intelligent approach. Look especially for parts where things are processed without freeing memory in between, i.e. sections of code that do not scale; data processing loops are typical candidates. – arkascha Oct 04 '16 at 14:10
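Following the second suggestion above (use less memory rather than more), PHPExcel supports read filters, which let you load the workbook one chunk of rows at a time instead of all at once. A sketch of that technique; the chunk size, the row-2 data start, and the $maxRow bound are assumptions to adjust for your file:

<?php
// Chunked reading with a PHPExcel read filter: each load() call pulls
// only the rows the filter admits, keeping memory usage bounded.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow = 0;

    // Select which rows the next load() should read.
    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the heading row; otherwise only the current chunk.
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$tmpfname = $_FILES['file_save']['tmp_name'];
$chunkSize = 1000;  // rows per pass; tune to your memory limit
$maxRow = 20000;    // assumption: an upper bound on the rows to import
$chunkFilter = new ChunkReadFilter();

$excelReader = PHPExcel_IOFactory::createReaderForFile($tmpfname);
$excelReader->setReadDataOnly(true); // skip styling information
$excelReader->setReadFilter($chunkFilter);

for ($startRow = 2; $startRow <= $maxRow; $startRow += $chunkSize) {
    $chunkFilter->setRows($startRow, $chunkSize);
    $excelObj = $excelReader->load($tmpfname); // loads only this chunk
    $worksheet = $excelObj->getSheet(0);

    // ... insert this chunk's rows into MySQL here ...

    // Free the chunk before loading the next one.
    $excelObj->disconnectWorksheets();
    unset($excelObj);
}
?>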

1 Answer


Follow this link: http://www.ozerov.de/bigdump/usage/

  1. Download and unzip bigdump.zip on your PC.
  2. Open bigdump.php in a text editor and adjust the database configuration and dump file encoding.
  3. Drop the old tables on the target database if your dump doesn’t contain “DROP TABLE” (use phpMyAdmin).
  4. Create a working directory (e.g. dump) on your web server.
  5. Upload bigdump.php and the dump files (*.sql or *.gz) via FTP to the working directory (take care to upload in TEXT mode for bigdump.php and dump.sql but in BINARY mode for dump.gz if uploading from MS Windows).
  6. Run bigdump.php from your web browser via a URL like http://www.yourdomain.com/dump/bigdump.php.
  7. Now you can select the file to be imported from the listing of your working directory. Click “Start import” to begin.
  8. BigDump will start each subsequent import session automatically if JavaScript is enabled in your browser.
  9. Relax and wait for the script to finish. Do NOT close the browser window!
  10. IMPORTANT: Remove bigdump.php and your dump files from your web server.
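For context, the idea behind BigDump (a "staggered" importer) is to execute only a bounded slice of the dump per HTTP request and resume from a saved file offset, so no single request can exhaust the memory or time limit. A rough illustration of that idea, not BigDump's actual code; the connection details, file name, and session size are all placeholders:

<?php
// Staggered import sketch: run a bounded number of dump lines per
// request, then report the offset where the next request should resume.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$linesPerSession = 3000; // how much work one request is allowed to do

$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'pass');
$handle = fopen('dump.sql', 'r');
fseek($handle, $offset); // resume where the previous session stopped

$statement = '';
$resumeAt = $offset; // offset just after the last fully executed statement
for ($i = 0; $i < $linesPerSession && ($line = fgets($handle)) !== false; $i++) {
    $statement .= $line;
    if (substr(rtrim($line), -1) === ';') { // end of one SQL statement
        $pdo->exec($statement);
        $statement = '';
        $resumeAt = ftell($handle); // safe point to resume from
    }
}

echo feof($handle) && $statement === ''
    ? 'Import finished.'
    : 'Continue from offset ' . $resumeAt; // BigDump re-requests via JavaScript
fclose($handle);
?>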
  • Whilst this is probably possible, the question is: _why_? Why not simply fix the issue at hand and upload and process the file? Why should one want to additionally install, configure, and use an insecure FTP server? Why rely on some arbitrary external solution for something as simple as a file upload? – arkascha Oct 04 '16 at 13:16