9

I have no idea why I'm experiencing this bug.

I have the following script:

foreach($brands as $brand){ // about 600 items in this loop
    ....
    ....
    DB::table('mailing_list')->insert(array(
                        'email'     => $email,
                        'source'    => $source,
                        'source_url'=> $brand->external_url,
                        'facebook_url'  => $facebook_url,
                        'instagram_id'  => $instagram_username,
                        'instagram_url' => $instagram_url,
                        'instagram_followers'   => $instagram_followers
                    ));
}

which always breaks at the same item:

Error while sending STMT_PREPARE packet. PID=2 (SQL: insert into `mailing_list` (`email`, `source`, `source_url`, `facebook_url`, `instagram_id`, `instagram_url`, `instagram_followers`) values (some@email.com, source, www.url.com, https://www.facebook.com/url, some_username, http://instagram.com/url, 501))

Earlier I was getting 29 queries executed before the failure; now it's 34.

I would like to understand this error better: it might be a single entry breaking it, but even though the data I posted above is placeholder, the actual values look fine to me.

WHAT I'VE TRIED:

set_time_limit(3600);
DB::connection()->disableQueryLog();
DB::reconnect(Config::get('database.default')); // this one after each foreach loop
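One more thing worth trying (a generic sketch, not tested against this exact setup): wrap each insert in a small retry helper that reconnects when the exception message looks like a dropped connection. `withRetry` and its message list are my own naming, not a Laravel API:

```php
<?php
// Hypothetical helper: retry $fn up to $tries times when the thrown
// exception looks like a lost-connection error; $onReconnect lets the
// caller run e.g. DB::reconnect() before the next attempt.
function withRetry(callable $fn, callable $onReconnect, $tries = 3)
{
    $lostConnectionMessages = array(
        'server has gone away',
        'Error while sending',
        'Lost connection',
    );
    for ($attempt = 1; ; $attempt++) {
        try {
            return $fn();
        } catch (\Exception $e) {
            $isLost = false;
            foreach ($lostConnectionMessages as $needle) {
                if (strpos($e->getMessage(), $needle) !== false) {
                    $isLost = true;
                    break;
                }
            }
            if (!$isLost || $attempt >= $tries) {
                throw $e; // not a dropped connection, or out of retries
            }
            $onReconnect();
        }
    }
}

// Usage inside the loop (assumes Laravel's DB facade):
// withRetry(
//     function () use ($row) { return DB::table('mailing_list')->insert($row); },
//     function () { DB::reconnect(); }
// );
```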
clod986
  • How about playing with the `max_allowed_packet` on your mysql configuration? This is a warning right? – majidarif Apr 08 '14 at 09:23
  • @majimboo, I forgot to mention I'm on shared hosting... and from this question http://stackoverflow.com/questions/5688403/how-to-check-and-set-max-allowed-packet-mysql-variable it seems like I won't be able to change this value, but I'll give it a try – clod986 Apr 08 '14 at 09:27
  • also, the insert query looks like it's missing single quotes around string values; you may want to check that. – Abhik Chakraborty Apr 08 '14 at 09:30
  • This seems to be built by Laravel's query builder. And yes, it is missing the quotes that signify a string. You might want to check your version of Laravel. – majidarif Apr 08 '14 at 09:33
  • 1
    how would you check it? I can tell you that I get about 40 rows inserted correctly, I don't think that it's the issue... but I'll try to sanitize it anyway – clod986 Apr 08 '14 at 09:39

5 Answers

8

The real reason and solution of the problem can be found in my blog post:

Laravel 4 and "Error while sending STMT_PREPARE packet"

It's impossible to describe the whole answer here but, long story short, there was a bug in Laravel 4's \Illuminate\Database\MySqlConnection class. Specifically, in its causedByLostConnection method which looked like this:

/**
 * Determine if the given exception was caused by a lost connection.
 *
 * @param  \Illuminate\Database\QueryException
 * @return bool
 */
protected function causedByLostConnection(QueryException $e)
{
    return str_contains($e->getPrevious()->getMessage(), 'server has gone away');
}

Unfortunately, "server has gone away" is not the only message indicating that a connection with MySQL was lost.

The same method in Laravel 5 checks for 6 additional messages and this solves the problem:

/**
 * Determine if the given exception was caused by a lost connection.
 *
 * @param  \Exception  $e
 * @return bool
 */
protected function causedByLostConnection(Exception $e)
{
    $message = $e->getMessage();

    return Str::contains($message, [
        'server has gone away',
        'no connection to the server',
        'Lost connection',
        'is dead or not enabled',
        'Error while sending',
        'decryption failed or bad record mac',
        'SSL connection has been closed unexpectedly',
    ]);
}
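If you are stuck on Laravel 4, the same check can be back-ported as a standalone function (a sketch based on the message list quoted above; `isLostConnectionMessage` is my own name, not a framework API):

```php
<?php
// Back-ported check: does an exception message indicate a dropped
// MySQL connection? Mirrors the Laravel 5 list quoted above.
function isLostConnectionMessage($message)
{
    $needles = array(
        'server has gone away',
        'no connection to the server',
        'Lost connection',
        'is dead or not enabled',
        'Error while sending',
        'decryption failed or bad record mac',
        'SSL connection has been closed unexpectedly',
    );
    foreach ($needles as $needle) {
        if (strpos($message, $needle) !== false) {
            return true;
        }
    }
    return false;
}
```

You could then catch the `QueryException` around your insert and call `DB::reconnect()` plus a retry when this returns true.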
Armen Markossyan
  • It's been a while since I got this error... I don't have time to test your answer, but if you manage to get a few upvotes I'll accept this answer – clod986 Jan 28 '16 at 12:11
  • No problem. Just wanted to contribute in case someone else faces this issue. – Armen Markossyan Jan 28 '16 at 13:42
  • 3
    Unfortunately, I've been getting this error a bunch even with Laravel 5.7, so there must be some other root cause and solution. – Ryan Nov 22 '18 at 15:22
  • I guess that Laravel has introduced a new problem because I haven't received such kind of feedback for a couple of years and then suddenly people started reporting that my solution doesn't work. Unfortunately, I don't use Laravel anymore, so I'm not sure what can be the problem. – Armen Markossyan Jan 31 '19 at 21:00
3

I had this problem in a script that did the following operations in this order:

  1. Connect to local database
  2. Connect to a remote API via cURL
  3. Update local database

It seems that step 2 was sometimes taking too long to complete, which resulted in a timed-out / broken connection to the database.

To fix, I simply changed the order of operations, so I connect to the database after the long-running operation, just before I do the work on the database:

  1. Connect to a remote API via cURL
  2. Connect to local database
  3. Update local database
ban-geoengineering
2

I've solved it by reducing the items passed to the foreach loop.

$all_brands = Brand::all();
$padding    = 0;
$batch      = 100;

while($all_brands->count() > $padding){
    $brands = Brand::orderBy('id', 'asc')->skip($padding)->take($batch)->get();
    foreach($brands as $brand){
        ....
        ....
        DB::table('mailing_list')->insert(array(
                        'email'     => $email,
                        'source'    => $source,
                        'source_url'=> $brand->external_url,
                        'facebook_url'  => $facebook_url,
                        'instagram_id'  => $instagram_username,
                        'instagram_url' => $instagram_url,
                        'instagram_followers'   => $instagram_followers
                    ));
    }
    $padding = $padding + $batch;
}
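The batching idea above can be expressed as a generic helper, independent of Eloquent (a sketch; `processInBatches` is my own name, not a framework API):

```php
<?php
// Process $items in slices of $batch, calling $fn on each slice.
// Keeping slices small means each DB round trip stays short, so an
// idle or timed-out connection is less likely to surface mid-run.
function processInBatches(array $items, $batch, callable $fn)
{
    $processed = 0;
    for ($offset = 0; $offset < count($items); $offset += $batch) {
        $slice = array_slice($items, $offset, $batch);
        $fn($slice);
        $processed += count($slice);
    }
    return $processed;
}
```

Newer Laravel versions provide `Brand::chunk(100, $callback)`, which does the same skip/take bookkeeping internally.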
clod986
  • I know this was written a while ago and maybe it was not available at that time, but today, you can chunk results so you don't have to manage the batches yourself. See the doc related to chunks: https://laravel.com/docs/5.8/queries#chunking-results – Lunfel Aug 23 '19 at 06:28
1

I had the same issue a few days ago, and the simplest solution was to change the database config file (the MySQL .cnf file, in my case). The variable that needed changing was wait_timeout. On my server it was 30 seconds; if you increase it, you won't get this error. In my case I set it to 10 minutes, and now my script works like a charm.

What was happening: the script would initialize, then sit and wait until all my API calls had finished (also in a foreach loop, by the way) before starting to parse them. By the time it needed to run a database query, it had already lost the connection. Hope it helps!
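For reference, the setting lives in the `[mysqld]` section of the MySQL config file (the 600-second value below matches the ten minutes mentioned above; whether you can edit the file depends on your hosting):

```ini
# my.cnf — requires a server restart to take effect
[mysqld]
wait_timeout = 600
```

If you cannot edit the config, running `SET SESSION wait_timeout = 600;` on the connection is an alternative, where the host permits it.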

rebatoma
0

The problem is definitely related to a closed connection. There are a number of reasons why this can happen.

In our case it was due to large packets of data being sent to the DB (we store blobs in the DB). Increasing the max_allowed_packet variable from the default 4MB to 32MB resolved the issue.
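The corresponding config change looks like this (server-side, in the MySQL config file; the 32M value matches what worked above):

```ini
# my.cnf — requires a server restart to take effect
[mysqld]
max_allowed_packet = 32M
```

You can check the current value with `SHOW VARIABLES LIKE 'max_allowed_packet';` before and after the change.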

Nikola Nikolić