
The company I work for uses Kayako to manage its support tickets. I was tasked with making an external web app that takes all the tickets from one company and displays their history.

I'm using mysqli_connect to connect to the database and mysqli_query to run the query:

$link = mysqli_connect($serverName, $userName, $password, $dbName);
$sql = "SELECT ticketid, fullname, creator, contents FROM swticketposts";
$result = mysqli_query($link, $sql) or die(mysqli_error($link));

The problem is that the contents column in the swticketposts table uses the LONGTEXT datatype. Trying to read this data with PHP gives me either timeout or maximum memory usage errors.

Line 49 is the $result = mysqli_query(...) line.

EDIT: I posted the wrong error message in the original post. The actual error is: Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 8192 bytes)

Trying to work around the memory problem, I added:

ini_set('memory_limit', '-1');

which gave me this timeout error instead:

Fatal error: Maximum execution time of 30 seconds exceeded in C:\xampp\htdocs\test\php\TicketCall.php on line 49

Thing is, when we view the tickets on the Kayako platform itself, it reads the same contents column and displays them instantly, so there must be a way to read these LONGTEXT values faster. How is beyond me.

Solutions I can't use:

  • Change the datatype to something smaller (it would break our Kayako system)

TL;DR: Is there a way to read LONGTEXT data with PHP and MySQL without running out of memory or hitting timeout errors?


1 Answer


First of all, you might consider changing your query to something like this:

 SELECT ticketid, fullname, creator, contents FROM swticketposts WHERE customer = 'something'

That will make your query return less data by filtering out the information from customers your report doesn't care about. It may save execution time on your query.
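If that customer value comes from user input (for example, a company picked in your web app), a prepared statement is a safe way to plug it in. Here's a minimal procedural sketch, assuming the column really is named customer as in the snippet above and using a hypothetical $companyName variable; note that mysqli_stmt_get_result() requires the mysqlnd driver:

 $sql = "SELECT ticketid, fullname, creator, contents
         FROM swticketposts
         WHERE customer = ?";
 $stmt = mysqli_prepare($link, $sql) or trigger_error(mysqli_error($link));
 if ($stmt) {
   mysqli_stmt_bind_param($stmt, 's', $companyName); /* 's' marks a string parameter */
   mysqli_stmt_execute($stmt);
   $result = mysqli_stmt_get_result($stmt);          /* needs the mysqlnd driver */
   while ($row = mysqli_fetch_assoc($result)) {
     /* deal with one row */
   }
   mysqli_stmt_close($stmt);
 }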

Second, you may wish to use

 SUBSTRING(contents, 1, 1000) AS contents

in place of contents in your query. This will get you back part of the contents column (the first thousand characters); it may (or may not) be good enough for what you're trying to do.
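Put together with the customer filter from above, the query might look something like this (the 1000-character cutoff is only an illustration; pick whatever preview length your history page needs):

 SELECT ticketid, fullname, creator, SUBSTRING(contents, 1, 1000) AS contents
 FROM swticketposts
 WHERE customer = 'something'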

Third, the mysqli extension uses buffered queries by default. That means the entire result set gets slurped into your program's memory when you run the query, which is probably why your memory limit is blowing out. Read this: http://php.net/manual/en/mysqlinfo.concepts.buffering.php

You can arrange to handle your query row-by-row doing this:

 $unbufResult = mysqli_query($link, $sql, MYSQLI_USE_RESULT) or trigger_error($link->error);
 if ($unbufResult) {
   while ($row = $unbufResult->fetch_assoc()) {
     /* deal with the data from one row */
   }
   $unbufResult->close();
 }

You need to be careful to use close() when you're done with these unbuffered result sets, because they use resources on your MySQL server until you do. This will probably prevent your memory blowout. (There's nothing special about using fetch_assoc() here; you can fetch each row with any method you wish.)
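For a ticket-history page like the one in the question, one way to exploit row-by-row reading is to render each post as soon as it's fetched instead of accumulating everything in an array first. A rough sketch; the HTML markup and column usage here are only illustrative:

 $unbufResult = mysqli_query($link, $sql, MYSQLI_USE_RESULT) or trigger_error($link->error);
 if ($unbufResult) {
   while ($row = $unbufResult->fetch_assoc()) {
     /* emit each post immediately so the LONGTEXT never piles up in PHP memory */
     echo '<div class="post">';
     echo '<h3>' . htmlspecialchars($row['fullname']) . '</h3>';
     echo nl2br(htmlspecialchars($row['contents']));
     echo '</div>';
     flush(); /* push output to the browser as you go */
   }
   $unbufResult->close();
 }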

Fourth, php's memory and time limits are usually tuned for interactive web site operation. Sometimes report generation takes a long time. Put calls to set_time_limit() in and around your loop, something like this.

 set_time_limit(300); /* five minutes */
 $unbufResult = mysqli_query($link, $sql, MYSQLI_USE_RESULT) or trigger_error($link->error);
 if ($unbufResult) {
   while ($row = $unbufResult->fetch_assoc()) {
     set_time_limit(30);  /* reset the time limit on each row */
     /* deal with the data from one row */
   }
   $unbufResult->close();
 }
 set_time_limit(300); /* set the time limit back to five minutes when done reading */
  • Thanks, everything worked like a charm. There are still some issues with my code, but they're not related to the issues I posted here, so thanks a lot for the help. I still haven't implemented the final part of your code with set_time_limit, but the time limits haven't been an issue since I added the other parts of your code. – Snigern Nov 21 '17 at 14:31