
The database

I'm working with a database that has some pretty big tables, and they're causing me problems. One in particular has more than 120k rows.

What I'm doing with it

I'm looping over this table in a MakeAverage.php file to merge its rows into about 1k rows in a new table in my database.

What doesn't work

Laravel doesn't let me process it all at once, even if I try DB::disableQueryLog() or a take(1000) limit, for example. It returns a blank page every time, even though my error reporting was enabled (kind of like this). Also, there was no Laravel log file for this; I had to look in my php_error.log (I'm using MAMP) to realize that it was actually a memory_limit problem.

What I did

I increased the memory limit before executing my code with ini_set('memory_limit', '512M'). (It's bad practice; I should do it in php.ini instead.)
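A minimal sketch of that workaround, assuming it runs at the top of MakeAverage.php before any queries:

```php
<?php

// Raise the memory limit for this request only (better done in php.ini).
// Note: both arguments must be quoted strings -- writing a bare
// memory_limit constant triggers the "undefined constant" notice
// mentioned in the comments below.
ini_set('memory_limit', '512M');

// ... run the heavy query/loop here ...
```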

What happened?

It worked! However, Laravel then threw an error because the page didn't finish loading within 30 seconds (PHP's default max_execution_time), due to the large amount of data.
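If you just need the long run to finish, the execution time limit can be raised the same way as the memory limit; a sketch with the same caveat (this too is better set in php.ini):

```php
<?php

// Remove the 30-second execution time limit for this script only.
// 0 means "no limit".
set_time_limit(0);
```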

Wistar
  • It sounds like you aren't looping `$table` properly in `average.php`. Can you post the relevant code you have in there as well as the code you are using to create `$table`? – user1669496 Oct 31 '13 at 19:54
  • Check your error logs if you have not already - this may be an error instead of a memory limit. You *usually* get an error on memory limit issues instead of a blank page - I'm guessing error reporting is off? – fideloper Oct 31 '13 at 20:29
  • One small thing, but I don't think it's your entire issue... `Trialdata::all()->get()` will not work and should be `Trialdata::all()`. That should have thrown an error because it did for me. – user1669496 Oct 31 '13 at 20:34
  • @fideloper My error reporting is on. My log says: `exception 'ErrorException' with message 'Use of undefined constant memory_limit - assumed 'memory_limit''` – Wistar Oct 31 '13 at 20:38
  • @user1669496 Indeed, I made a mistake in writing the code here but my original file was fine. Even when I add `->get()` it does not throw an error, as if Laravel stops after reading `::all()`. Everything written after it isn't processed. – Wistar Oct 31 '13 at 20:40
  • Interesting. Some googling confirms that others have gotten white screens on memory errors. I should mention that the only other time I've gotten white screens with no errors was when PHP attempted to use modules which weren't loaded (for example GD, curl, PDO) – fideloper Oct 31 '13 at 21:05
  • That doesn't sound like a problem with the actual memory limit. It sounds like you are trying to call a constant called `memory_limit` and it's undefined. Try defining `memory_limit` as a constant and see what happens. `define('memory_limit', '128M');` – user1669496 Oct 31 '13 at 21:15
  • @user1669496 When I define the memory limit, I come across the blank page but no error is logged at all... – Wistar Oct 31 '13 at 21:31

1 Answer


What I will do

After spending some time on this issue and looking at other people having similar problems (see: Laravel forum, 19453595, 18775510 and 12443321), I thought that maybe PHP isn't the solution.

Since I'm only creating a table B from the average values of table A, I believe that SQL is going to fit my needs best, as it's clearly faster than PHP for that type of operation (see: 6449072), and I can use aggregate functions such as SUM, AVG and COUNT along with GROUP BY (Reference).
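A sketch of what that could look like from Laravel, using a single INSERT ... SELECT so the database does all the work and no rows are ever loaded into PHP. The names table_a, table_b, group_id and value are placeholders, not the asker's actual schema:

```php
<?php

// Build table B from per-group averages of table A in one statement,
// entirely inside the database.
DB::statement('
    INSERT INTO table_b (group_id, avg_value, row_count)
    SELECT group_id, AVG(value), COUNT(*)
    FROM table_a
    GROUP BY group_id
');
```

This sidesteps both the memory_limit and the max_execution_time problems at once, since PHP only sends one query instead of iterating 120k rows.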

Wistar