// file_get_html() comes from the PHP Simple HTML DOM Parser library; adjust the path as needed
require_once 'simple_html_dom.php';

$html = file_get_html('http://www.oddsshark.com/mlb/odds');
echo $html;

When echoed, the error message in the title of this question appears. I've had similar problems before; in all of those cases I didn't actually need to increase the memory limit in php.ini. Rather, there was a missing curly bracket needed to close a loop. The page I'm requesting via the file_get_html function appears fine in my browser, but it just won't let me echo the HTML via PHP.

Any ideas?

Lance

4 Answers


Increase the memory limit in your php.ini file.

Search for:

; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 128M

in your php.ini file and increase it to 512M.
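
For illustration, the edited line would then look like the following (512M is only an example; choose whatever value fits your server):

memory_limit = 512M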

Manish Jangir
  • If it takes more than 33MB to load a 300kb HTML file, there's a problem that's bigger than just a memory limit. – Niet the Dark Absol May 18 '13 at 18:37
  • It can also be done by setting ini_set('memory_limit', '128M') directly in the PHP file. – Deniss Kozlovs May 18 '13 at 18:37
  • The problem probably has to do with the internal processing of the HTML, so either fixing the code or increasing the limit is an option. – jabbink May 18 '13 at 18:38
  • Unless you modify the PHP Simple HTML DOM Parser library, it gives up if the file is more than 600KB, to avoid using too much memory. – Barmar May 18 '13 at 18:40
  • @Kolink: The problem is the library. And you can't expect to get it actually fixed, because it's superfluous in times of DOMDocument and friends. So suggesting to increase the memory limit - even if not very creative - actually doesn't sound like the worst suggestion here. The alternative is to replace the library (and I would go with that normally). – hakre May 18 '13 at 18:45

Why not use a library that is more memory-efficient? Simple HTML DOM is just not necessary any longer:

$html = new DOMDocument;
$html->loadHTMLFile('http://www.oddsshark.com/mlb/odds');
echo $html->saveHTML();
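
If the page contains malformed HTML (as real-world pages often do), loadHTMLFile will emit parser warnings; an optional refinement is to route them through libxml's internal error handling, roughly like this:

$html = new DOMDocument;
libxml_use_internal_errors(true);   // collect parser warnings instead of printing them
$html->loadHTMLFile('http://www.oddsshark.com/mlb/odds');
libxml_clear_errors();              // discard the collected warnings
echo $html->saveHTML();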

More suggestions are available in the reference question on that topic.

hakre
  • '->:' ... the : is a typo, right? This is a good answer, but I don't see the point of using DOMDocument just to print the raw source of an HTTP request; it looks like it will generate a useless object and involve useless processing... – FMaz008 May 18 '13 at 18:53
  • @FMaz008: Yes, the colon `:` is a typo. Sure :) - Fixed + Thx. - For the example: This is the example the user does in the question, so I actually did not judge about how high- or low-level it is, just showing that other libraries are able to read a document into memory without triggering the memory limit. – hakre May 18 '13 at 21:25

I really don't think this usage justifies increasing the memory limit. If it goes over, it's because there is a problem: Simple HTML DOM Parser is known to suffer from memory leaks.

If you just need to retrieve the content of a remote page over HTTP, do the following. It is the simplest and most resource-efficient way I know to retrieve content from a remote page:

<?php
$homepage = file_get_contents('http://www.example.com/');
echo $homepage;
?>

If you need to make more advanced requests, you may look into cURL: http://php.net/manual/en/book.curl.php
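
A rough sketch of what a cURL-based fetch might look like (the options shown are illustrative, not a complete configuration):

<?php
// Fetch the page with cURL and return the body as a string
$ch = curl_init('http://www.oddsshark.com/mlb/odds');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the response instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects if the URL has moved
$homepage = curl_exec($ch);
curl_close($ch);
echo $homepage;
?>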

FMaz008

Edit your php.ini file

memory_limit=512M

Or add a line in your PHP file:

ini_set('memory_limit', '512M');

And the error will be resolved. Note: you can use your own value instead of 512M.

Pupil