46

Has anyone ever come across this error: General error: 1390 Prepared statement contains too many placeholders

I just imported over 50,000 records via Sequel Pro, and now when I go to view those records in my view (Laravel 4) I get General error: 1390 Prepared statement contains too many placeholders.

The index() method below, from my AdminNotesController.php file, is what generates the query and renders the view.

public function index()
{
    $created_at_value = Input::get('created_at_value');
    $note_types_value = Input::get('note_types_value');
    $contact_names_value = Input::get('contact_names_value');
    $user_names_value = Input::get('user_names_value');
    $account_managers_value = Input::get('account_managers_value');

    if (is_null($created_at_value)) $created_at_value = DB::table('notes')->lists('created_at');
    if (is_null($note_types_value)) $note_types_value = DB::table('note_types')->lists('type');
    if (is_null($contact_names_value)) $contact_names_value = DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname');
    if (is_null($user_names_value)) $user_names_value = DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname');

    // In the view, there is a dropdown box, that allows the user to select the amount of records to show per page. Retrieve that value or set a default.
    $perPage = Input::get('perPage', 10);

    // This code retrieves the order from the session that has been selected by the user by clicking on a table column title. The value is placed in the session via the getOrder() method and is used later in the Eloquent query and joins.
    $order = Session::get('account.order', 'company_name.asc');
    $order = explode('.', $order);

    $notes_query = Note::leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')));

    if (!empty($created_at_value)) $notes_query = $notes_query->whereIn('notes.created_at', $created_at_value);

    $notes = $notes_query->whereIn('note_types.type', $note_types_value)
        ->whereIn(DB::raw('CONCAT(contacts.first_name," ",contacts.last_name)'), $contact_names_value)
        ->whereIn(DB::raw('CONCAT(users.first_name," ",users.last_name)'), $user_names_value)
        ->paginate($perPage)->appends(array('created_at_value' => Input::get('created_at_value'), 'note_types_value' => Input::get('note_types_value'), 'contact_names_value' => Input::get('contact_names_value'), 'user_names_value' => Input::get('user_names_value')));

    $notes_trash = Note::onlyTrashed()
        ->leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')))
        ->get();

    $this->layout->content = View::make('admin.notes.index', array(
        'notes'             => $notes,
        'created_at'        => DB::table('notes')->lists('created_at', 'created_at'),
        'note_types'        => DB::table('note_types')->lists('type', 'type'),
        'contacts'          => DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname', 'cname'),
        'accounts'          => Account::lists('company_name', 'company_name'),
        'users'             => DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname', 'uname'),
        'notes_trash'       => $notes_trash,
        'perPage'           => $perPage
    ));
}

Any advice would be appreciated. Thanks.

Antonio Carlos Ribeiro
Gareth Daine

8 Answers

59

Solved this issue by using the array_chunk function.

Here is the solution:

foreach (array_chunk($data, 1000) as $t) {
    DB::table('table_name')->insert($t);
}
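
For reference, this assumes $data is an array of rows keyed by column name, which is what Laravel's query builder expects for insert(); the values below are only illustrative:

// Hypothetical example data: each row contributes one set of placeholders,
// so 1,000 rows of a few columns stays well under the 65,535 limit.
$data = array(
    array('note' => 'First note',  'created_at' => '2013-10-01 09:00:00'),
    array('note' => 'Second note', 'created_at' => '2013-10-01 09:05:00'),
);

foreach (array_chunk($data, 1000) as $chunk) {
    DB::table('table_name')->insert($chunk);
}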

        
Regolith
Faridul Khan
36

There is a limit of 65,535 (2^16-1) placeholders in MariaDB 5.5, which is supposed to have identical behaviour to MySQL 5.5.

Not sure if it's relevant; I tested it on PHP 5.5.12 using MySQLi / mysqlnd.
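
To stay under that limit programmatically, one option (a sketch only, assuming Laravel's query builder and a $rows array of associative rows that all share the same keys) is to derive the chunk size from the column count:

// Each row contributes one placeholder per column, so cap the chunk size
// at floor(65535 / columns). The table name is illustrative.
$columnCount = count($rows[0]);
$chunkSize   = (int) floor(65535 / $columnCount);

foreach (array_chunk($rows, $chunkSize) as $chunk) {
    DB::table('table_name')->insert($chunk);
}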

Martin
  • This is the answer I was looking for! – Stéphane Feb 18 '22 at 14:20
  • I'm already chunking, but hit this error. I have now changed `$table['chunksize'] = floor(pow(2,16) / $table['cols']);` to `$table['chunksize'] = floor((pow(2,16) / $table['cols']) - 1);`. I'm using PHP PDO: `$start = $start * $chunksize;` followed by SQL `LIMIT {$start}, {$chunksize}`. Thanks for the -1 I missed, I just happened to hit %0 that one time. – MaKR Jun 07 '22 at 22:30
12

This error only happens when both of the following conditions are met:

  1. You are using the MySQL Native Driver (mysqlnd) and not the MySQL client library (libmysqlclient)
  2. You are not emulating prepares.

If you change either one of these factors, this error will not occur. However, keep in mind that both of these are recommended, either for performance or for security reasons, so I would not recommend this solution for anything other than a one-time or temporary problem. To prevent this error from occurring, the fix is as simple as:

$dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, true);
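
In a Laravel 4 app like the one in the question, the same PDO attribute can be passed through the connection's options array in app/config/database.php. This is only a sketch of what that might look like, not a complete config:

// app/config/database.php (excerpt)
'mysql' => array(
    'driver'    => 'mysql',
    'host'      => 'localhost',
    'database'  => 'database',
    'username'  => 'root',
    'password'  => '',
    'charset'   => 'utf8',
    'collation' => 'utf8_unicode_ci',
    'prefix'    => '',
    // Enable prepared-statement emulation for this connection only.
    'options'   => array(
        PDO::ATTR_EMULATE_PREPARES => true,
    ),
),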
Mike
  • Thanks @mike, this is the answer I was looking for – alex Jun 21 '16 at 23:06
  • @Sadee Where and how the database configuration options are stored and how the connection is opened is different for every application. – Mike Apr 11 '17 at 20:11
  • @Mike, yes, you're correct, and doing it that way (setting the option in the config/database.php file) generated other issues in my app (e.g. when setting parameters on queries). So I removed it and set this attribute just before this big query. Any suggestions about that? I removed my previous comment. – Sadee Apr 12 '17 at 08:54
  • @Sadee You would probably be better to start a new question about that since it's a bit off topic for this one. Include your code and any errors you receive. – Mike Apr 12 '17 at 15:41
  • @Mike: after changing the attribute to 'true', the `error: 1390 Prepared statement contains too many placeholders` disappeared. But I've observed that INSERT speed then becomes much slower: inserting 25,000 rows with 7 columns took 106 seconds. After reducing the number of rows to 9,200, the insert took only approx. 15 seconds. As a result, I would say that one should keep an eye on the number of placeholders. – Peter Sep 20 '17 at 08:50
  • @Peter That's because PHP is emulating the prepares, so this is not as efficient as doing them natively with MySQL. As I mentioned in the answer, enabling emulated prepares is **not** recommended to remain enabled as a default setting. Typically when importing many thousands of rows, this is a one-time thing, not really something you're going to be doing every day, so performance is really not an issue. If it is, you may want to break the query up to limit the number of placeholders. – Mike Sep 20 '17 at 21:15
9

While I think @The Disintegrator is correct about the placeholders being limited, I would not run one query per record.

I have a query that worked fine until I added one more column; now I have 72k placeholders and I get this error. However, that 72k is made up of 9,000 rows with 8 columns. Running this query one record at a time would take days. (I'm trying to import AdWords data into a DB, and it would literally take more than 24 hours to import a day's worth of data if I did it one record at a time. I tried that first.)

What I would recommend is something of a hack. First, determine the maximum number of placeholders you want to allow – e.g. 60k to be safe. Use this number to determine, based on the number of columns, how many complete records you can import/return at once. Create the full array of data for your query. Then use array_chunk and a foreach loop to grab everything you want in the minimum number of queries. Like this:

$maxRecords = 1000;
$sql = 'SELECT * FROM ...';
$qMarks = array_fill(0, $maxRecords, '(?, ...)');
$tmp = $sql . implode(', ', $qMarks);
foreach (array_chunk($data, $maxRecords) as $junk => $dataArray) {
  if (count($dataArray) < $maxRecords) { break; }

  // Do your PDO stuff here using $tmp as your SQL statement with all those placeholders - the ?s
}

// Now insert all the leftovers with basically the same code as above except accounting for
// the fact that you have fewer than $maxRecords now.
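
Here is a more concrete, self-contained take on the same idea – a sketch only, assuming plain PDO, a made-up import_table, made-up columns, and a $data array of rows whose values are in the same order as those columns. It rebuilds the statement for each chunk, so the last, smaller chunk covers the leftovers without a separate pass:

// Chunked multi-row INSERTs; connection details and names are illustrative.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

$columns    = array('campaign', 'clicks', 'cost');
$maxRecords = (int) floor(60000 / count($columns)); // stay safely under 65,535 placeholders

foreach (array_chunk($data, $maxRecords) as $chunk) {
    // One "(?, ?, ?)" group per row in this chunk.
    $rowMarks = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
    $sql = 'INSERT INTO import_table (' . implode(', ', $columns) . ') VALUES '
         . implode(', ', array_fill(0, count($chunk), $rowMarks));

    // Flatten the chunk into a single flat list of bound values.
    $params = array();
    foreach ($chunk as $row) {
        foreach ($row as $value) {
            $params[] = $value;
        }
    }
    $pdo->prepare($sql)->execute($params);
}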
Gabe Spradlin
2

Using Laravel models, this copies all 11,000 records from an SQLite database to a MySQL database in a few seconds, chunking the data array into 500 records at a time:

public function handle(): void
{
    $smodel = new Src_model();
    $smodel->setTable($this->argument('fromtable'));
    $smodel->setConnection('default'); // sqlite database
    $src = $smodel::all()->toArray();

    $dmodel = new Dst_model();
    $dmodel->setTable($this->argument('totable'));
    $dmodel->timestamps = false;
    $stack = $dmodel->getFields();
    $fields = array_shift($stack);

    $condb = DB::connection('mysql');
    $condb->beginTransaction();

    $dmodel::query()->truncate();
    $dmodel->fillable($stack);
    $srcarr = array_chunk($src, 500);
    $isOK = true;
    foreach ($srcarr as $item) {
        if (!$dmodel->query()->insert($item)) $isOK = false;
    }
    if ($isOK) {
        $this->notify("Moved the table from {$this->argument('fromtable')} to {$this->argument('totable')}", 'It will be fresher than ever!');
        $condb->commit();
    }
    else $condb->rollBack();

}
2

You can do it with the array_chunk function, like this:

foreach (array_chunk($data, 1000) as $key => $smallerArray) {
    $temp = array();
    foreach ($smallerArray as $index => $value) {
        $temp[$index] = $value;
    }
    DB::table('table_name')->insert($temp);
}
Jaskaran Singh
1

My fix for the above issue: when I got this error, I fixed it by reducing the bulk-insertion chunk size from 1000 to 800, and it worked for me. My table had too many fields, and most of them contain long descriptions, the size of a full page of text. When I ran the bulk insertion, the service crashed and threw the above error.
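
In Laravel terms, that fix is simply a smaller chunk size for the same bulk insert (a sketch; the table name and $rows are illustrative):

// Wider rows mean more placeholders per row, so use smaller chunks.
foreach (array_chunk($rows, 800) as $chunk) {
    DB::table('table_name')->insert($chunk);
}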

Farid Abbas
-3

I think the number of placeholders is limited to 65,536 per query (at least in older MySQL versions).

I really can't discern what this piece of code is generating, but if it's a gigantic query, there's your problem.

You should generate one query per record to import and put those into a transaction.
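
For completeness, that suggestion would look roughly like this – a sketch assuming plain PDO, a made-up notes table, and a $records array of rows. As the comments below point out, it avoids the placeholder limit but is much slower than chunked multi-row inserts:

// One prepared statement, executed once per record, wrapped in a transaction.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO notes (note, created_at) VALUES (?, ?)');

$pdo->beginTransaction();
try {
    foreach ($records as $record) {
        $stmt->execute(array($record['note'], $record['created_at']));
    }
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}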

piet.t
The Disintegrator
  • An explanation for the downvote would be nice. I'm pretty sure the problem is that the OP was hitting the placeholder limit. – The Disintegrator Dec 12 '13 at 15:39
  • I assume this is the reason you've been downvoted: "You should generate one query per record" – VMC Jun 30 '15 at 14:20
  • "You should generate one query per record" is slow in my case. One query per record is slower than using up all 65,536 placeholders. SQL parsing and transaction overhead will cost CPU, and network round trips will cost a lot of time too. – bronze man Sep 22 '17 at 09:34