
I successfully receive data on the server via $_POST, but I can't insert it into a table row that holds longblob types when the row data is above ~25MB. I can't see any errors thrown by the PHP prepared statements, and the PHP file closes gracefully. I'm checking whether my max_allowed_packet is correctly configured.

I don't have root access to the server, so I can't see my.cnf, and I'm talking to my host to ensure both the client and server max_allowed_packet are set to 256M.

PHP file insert section:

$itemAttributes = $_POST['itemAttributes'];
$itemName = $_POST['itemName'];

if (!($stmt = $conn->prepare("INSERT INTO $itemTable (`itemName`, `itemAttributes`) VALUES (?, ?)"))) {
    echo "<br/>Failed to Prepare";
} else {
    echo "<br/>Prepare success.";
}

// note: the return value of execute() is not checked here
$stmt->execute([$itemName, $itemAttributes]);
echo " Success executing";

$conn->connection = null;
if ($conn->connection == null)
    echo "<br />Connection Closed.......";

These are the checks I am doing. Am I missing any, to make sure max_allowed_packet won't be overridden elsewhere and is set up correctly in all places?

Client setting check: all three of these checks give me a 256MB max_allowed_packet:

mysql SHOW VARIABLES LIKE 'max_allowed_packet' 
mysql SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet'
mysql SHOW SESSION VARIABLES LIKE 'max_allowed_packet'

However, when I successfully log in to the mysql command prompt (mysql --user userName --password databaseName) to check max_allowed_packet, it shows NULL. What does that mean?

mysql> select @max_allowed_packet;
+---------------------+
| @max_allowed_packet |
+---------------------+
| NULL                |
+---------------------+
1 row in set (0.00 sec)

Server setting check: how can I check the server's 'max_allowed_packet' for GLOBAL and SESSION? I tried the commands above, replacing "mysql" with "mysqld", but I get no result, only warnings I can't relate to the size of 'max_allowed_packet':

[useraccount@cloud ~]$ mysqld SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet'
2020-02-12T07:47:33.832292Z 0 [Warning] Could not increase number of max_open_files to more than 1024 (request: 65536)
2020-02-12T07:47:33.832480Z 0 [Warning] Changed limits: max_connections: 214 (requested 256)
2020-02-12T07:47:33.832487Z 0 [Warning] Changed limits: table_open_cache: 400 (requested 2000)
2020-02-12T07:47:33.945335Z 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
2020-02-12T07:47:33.945476Z 0 [Warning] Can't create test file /var/lib/mysql/cloud.lower-test
2020-02-12T07:47:33.945569Z 0 [Note] mysqld (mysqld 5.7.29) starting as process 75 ...
2020-02-12T07:47:33.947615Z 0 [Warning] Can't create test file /var/lib/mysql/cloud.lower-test
2020-02-12T07:47:33.947631Z 0 [Warning] Can't create test file /var/lib/mysql/cloud.lower-test
2020-02-12T07:47:33.948025Z 0 [ERROR] Could not open file '/var/log/mysqld.log' for error logging: Permission denied
2020-02-12T07:47:33.948066Z 0 [ERROR] Aborting
2020-02-12T07:47:33.948138Z 0 [Note] Binlog end
2020-02-12T07:47:33.948287Z 0 [Note] mysqld: Shutdown complete

mysqldump check: finally, I tried to read max_allowed_packet for mysqldump with the three commands above, substituting "mysql" with "mysqldump" (i.e. mysqldump SHOW VARIABLES LIKE 'max_allowed_packet'), but I get access denied. As an alternative I then ran mysqldump with a login (mysqldump --user userName --password databaseName), but that just scrolls a lot of output through the screen, so I can't get the value that way either.
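(I assume running mysqldump --print-defaults would at least show which option-file settings the tool picks up, though I'm not sure it reflects values set only on the server side.)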

  • `select @@max_allowed_packet` – Paul Spiegel Feb 12 '20 at 08:19
  • thanks @PaulSpiegel, I get 256MB. If I enter this command at the mysql> prompt, is that the client or server value? How do I enter this command to check the client, server and mysqldump values? – Sergio Solorzano Feb 12 '20 at 08:21
  • I don't think a client has such a setting. This is the max size of what the server accepts. "I cannot see any errors thrown by the php prepared statements" - Make sure you configure PHP and PDO or mysqli to throw exceptions. By default both fail silently on `execute()` (see the sketch after this comment thread). You should also post the relevant PHP code here. – Paul Spiegel Feb 12 '20 at 08:29
  • You may try to increase this value in your session with a `SET SESSION max_allowed_packet = new_value;` statement executed immediately before the inserting query (strictly in the same connection! best in the same command, if it allows multi-queries), for example try 512M. If the 25M limit doesn't change, there is some other reason for your issue. You may also divide the data to be inserted into 2-3 chunks and insert it with 2-3 separate INSERTs. – Akina Feb 12 '20 at 08:31
  • @PaulSpiegel thanks, I've added the PHP section. No errors are thrown, the echoes confirm the data reaches the server correctly, and the PDO statements prepare/execute correctly. I've posted the PHP code in case you have any other ideas. – Sergio Solorzano Feb 12 '20 at 08:42
  • @Akina thanks, SET SESSION in the PHP file didn't solve it. The problem with breaking the insert statement out per variable is that one of them in particular can itself be larger than 25MB, so it doesn't solve the issue unfortunately – Sergio Solorzano Feb 12 '20 at 08:52
  • @PaulSpiegel if it helps to clarify: this doc says there is a client and a server max_allowed_packet size, and I'm trying to ensure both, plus mysqldump, are set correctly: https://dev.mysql.com/doc/refman/8.0/en/packet-too-large.html – Sergio Solorzano Feb 12 '20 at 09:10
  • How do you know that `$stmt->execute(..)` was successful? I told you that it **fails silently** with default settings. Please read [this post](https://stackoverflow.com/a/32648423/5563083). Also add `var_dump($stmt->rowCount());` after `execute()`. If you get `1`, then the row was inserted. If you get `0`, then `execute()` failed silently and you need to get the error message. Without an error message we can sit and guess all day long. – Paul Spiegel Feb 12 '20 at 09:53
  • hi @PaulSpiegel yes sorry, now I understand what you meant by silent. I've added `ini_set('display_errors', 1); ini_set('display_startup_errors', 1); error_reporting(E_ALL);` at the top of the PHP file and, after `execute()`, `trigger_error("PDO errorInfo: ".$conn->errorInfo());`. When I upload 12MB of data it populates the db ok, var_dump gives 1 and no errors are echoed. With 32MB of data it does not populate the db, the var_dump is not shown, and it echoes: Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32000179 bytes) in /home/client/public_html/phpfile.php on line 175. Line 175 has the execute PDO statement – Sergio Solorzano Feb 12 '20 at 11:44
  • This is a PHP error message. Check [memory_limit](https://www.php.net/manual/en/ini.core.php#ini.memory-limit). I guess you have copied the 32MB of data multiple times in PHP - and then `execute()` makes a new copy. You might avoid that last copy by using `bindParam()`, which will probably not make a copy. – Paul Spiegel Feb 12 '20 at 11:50

1 Answer


As it turns out, this has nothing to do with MySQL directly.

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32000179 bytes) in /home/client/public_html/phpfile.php on line 175

This is a PHP error message, which tells you that your script has exceeded the 128M memory limit.

You can increase the limit either in your config file or in your script with

ini_set('memory_limit','256M');

You can use -1 as the value to disable the limit completely.
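The config-file variant is a single line, e.g. in php.ini (assuming your host lets you edit it):

memory_limit = 256M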

However, you should still avoid copying large amounts of data.

The following case is clear:

$itemAttributes=$_POST['itemAttributes'];

You copy 32M of data into a new variable, and your script is at least at 64M now.

The execute() method with parameters is trickier, and I'm not sure if the following is exactly accurate: you pass a new array as a parameter

[$itemName,$itemAttributes]

This array first needs to be created in memory before it is passed to execute(). That consumes at least another 32M. Then, due to the internal implementation (which I don't know), every array element is passed to something like bindValue(), which again copies all the data. At this point your script is already at the 128M limit (32M × 4).

So you should do something like the following:

Remove these lines:

$itemAttributes=$_POST['itemAttributes'];
$itemName=$_POST['itemName'];

Prepare the statement:

$stmt = $conn->prepare("INSERT INTO $itemTable (`itemName`, `itemAttributes`) VALUES (?, ?)");

Bind the parameters with bindParam:

$stmt->bindParam(1, $_POST['itemName'], PDO::PARAM_STR);
$stmt->bindParam(2, $_POST['itemAttributes'], PDO::PARAM_LOB);
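
Then call execute() without arguments, so the bound references are used and the values are only read at execution time:

$stmt->execute();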

bindParam() uses call by reference, which you can see in its signature:

public PDOStatement::bindParam ( mixed $parameter , mixed &$variable [, int $data_type = PDO::PARAM_STR [, int $length [, mixed $driver_options ]]] ) : bool

The & in &$variable indicates that no copy is made; a reference to the value is passed instead.

Note that I am usually not a fan of call by reference, and avoid it when not dealing with resource-critical code. execute([..]) is fine in most cases.

If you want to see how much memory has been allocated, you can use memory_get_peak_usage() somewhere at the end of your script.
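For example (the true flag reports the real size of the memory allocated from the system):

// print peak memory usage in MB at the end of the script
printf("Peak memory: %.1f MB\n", memory_get_peak_usage(true) / 1048576);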
