
I am developing code to generate a PDF from HTML using the mPDF library. The HTML is read from an external file, but it does not work when the HTML is large. Is there any way to fix this, or is there another library that supports this functionality?

For larger HTML files it gives this error:

Fatal error: Uncaught Mpdf\MpdfException: The HTML code size is larger than pcre.backtrack_limit 1000000. You should use WriteHTML() with smaller string lengths.

– Satya Mahesh (edited by James Z)
  • You might look at the HTML in a text editor -- perhaps you can pre-process the HTML to strip out parts that are not needed before you feed it to mpdf. If not, perhaps you can partition it into smaller pieces somehow. – Dave S Mar 05 '18 at 21:17

4 Answers


The error message tells you what to do: pass your HTML to the WriteHTML() method in smaller chunks.

Or you can try to increase your backtrack limit even more:

ini_set("pcre.backtrack_limit", "5000000");

https://mpdf.github.io/troubleshooting/known-issues.html#blank-pages-or-some-sections-missing
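A minimal sketch of both options, assuming mPDF is installed via Composer (`composer require mpdf/mpdf`); the file names are illustrative:

```php
<?php
// Option 1: raise the PCRE backtrack limit before writing large HTML.
ini_set('pcre.backtrack_limit', '5000000');

require __DIR__ . '/vendor/autoload.php';

$mpdf = new \Mpdf\Mpdf();
$html = file_get_contents('large-report.html'); // hypothetical input file

// Option 2: write the HTML in smaller chunks instead of one big string.
// Note: str_split() can cut through the middle of a tag; splitting on a
// known boundary in the markup (see the next answer) is more robust.
foreach (str_split($html, 500000) as $chunk) {
    $mpdf->WriteHTML($chunk);
}

$mpdf->Output('large-report.pdf', \Mpdf\Output\Destination::FILE);
```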

– Finwe

I faced the same problem when exporting a big document. I solved it by dividing the HTML I am exporting into smaller pieces, as Finwe suggested. This is the pseudo-code:

In the HTML file I want to export, I insert a marker that I will later use to split the HTML. In this case I use the word chunk:

<body>
@if (count($doc_items) > 0)
    @foreach($doc_items as $item)
        chunk
        <div>
            {{-- Item text here --}}
        </div>
    @endforeach
@endif
</body>

On the controller side, I modify the following parameters just in case, although it might not be necessary depending on the export size:

ini_set('max_execution_time', '300');
ini_set("pcre.backtrack_limit", "5000000");

and then I process the HTML:

$chunks = explode("chunk", $html);
foreach ($chunks as $chunk) {
    $mpdf->WriteHTML($chunk);
}
– AdriRomas
  • This is Genius, I gotta try that..! Thanks – raphjutras Apr 09 '20 at 13:10
  • @AdriRomas how did you assign the page's html to `$html`? – Bennett Sep 28 '20 at 19:35
  • 1
    @Bennett Hello. In my case (I'm using Laravel php framework) it is done with the render function like this: $html = view('documents.exportView', compact('document', 'doc_items'))->render(); – AdriRomas Sep 30 '20 at 06:39
  • 2
    Use `` as marker as it wont be rendered in html – Jerem Mar 15 '22 at 12:23
  • I tried to split a big report with pictures as explained in this very good answer. It turned out the pictures were the problem, as they were base64-encoded text inside the HTML and were 1.4m characters long. I tried to split the base64 code, but `writeHtml()` does not combine chunks like this; it needs valid HTML, and it added my base64 pieces as visible text. So, in my case, there was no other way than increasing `memory_limit` and `pcre.backtrack_limit`. It is however easy and in my case effective. – peter_the_oak Dec 07 '22 at 15:55

You are passing a very large HTML string, and it won't be accepted. See https://mpdf.github.io/troubleshooting/known-issues.html#blank-pages-or-some-sections-missing

– Rafael Xavier
ini_set('memory_limit', '900000M');
ini_set("pcre.backtrack_limit", "2000000");

Good afternoon. I solved the problem by editing these two variables, together with the amount of HTML I am supplying. This way it worked for a file of 49 pages with 3,351 rows. If the file is heavier, both amounts must be increased. For example, on the scale presented above, if the file were 59 pages and 4,351 rows, the code would be:

 ini_set('memory_limit', '1500000M');
 ini_set("pcre.backtrack_limit", "3000000");

and so on, depending on the size of the file, until you don't get an error.

It gave me an error and I was able to solve it like this, in a simple way. Best regards.

  • Please translate your answer to english, such that everyone here can understand it. Also, please add some explanation to your answer - modifying the limit has already been suggested in another answer – Nico Haase Jul 26 '22 at 18:23
  • Good afternoon. Yes, the modification of pcre.backtrack_limit was suggested, but not of memory_limit. The two combined, together with the amount that I am supplying, worked for me for a file of 49 pages with 3,351 rows; if the file is heavier, both amounts should be increased. It gave me an error and I could solve it just like that, in a simple way. Best regards – Duber Pesca - un Cristiano Jul 26 '22 at 18:29
  • Please add **all** clarification to your answer by editing it – Nico Haase Jul 26 '22 at 18:34
  • OK, thanks, I already edited it. See you – Duber Pesca - un Cristiano Jul 26 '22 at 18:47
  • The given configuration sounds like a pretty bad idea. A memory limit of `1500000M` equals to `1500G`, and I doubt any **good** PHP application needs that much memory – Nico Haase Jul 26 '22 at 19:46
  • OK, well, it worked for me; everyone can do as they see fit – Duber Pesca - un Cristiano Jul 26 '22 at 21:06