31

I'm building a REST-like service in PHP that should accept a large JSON POST body as its main data (I send and read the data much as discussed here: http://forums.laravel.io/viewtopic.php?id=900)

The problem is that PHP gives me the following Warning:

Warning: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in Unknown on line 0

Is there any way to get PHP not to count input variables (or should I just suppress startup warnings)?
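For context, the read side of what I'm doing boils down to this (a minimal sketch in case the linked forum post is unreachable; the function name is mine, and it assumes PHP 7.1+ for the nullable return type):

```php
<?php
// Decode a raw JSON request body into an array, or return null on error.
// In the service this is fed with file_get_contents('php://input').
function read_json_body(string $raw): ?array {
    $data = json_decode($raw, true);
    return (json_last_error() === JSON_ERROR_NONE && is_array($data)) ? $data : null;
}
```

On a live request this would be called as `read_json_body(file_get_contents('php://input'))`.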

George Cummins
Fredrik Erlandsson
  • Why not change `max_input_vars` as the warning recommends? – George Cummins May 09 '13 at 20:31
  • Because I don't like the idea of changing it; my posted JSON data could be very large. I don't need help from PHP to parse my variables, as I don't send any variables, just JSON-encoded data. – Fredrik Erlandsson May 09 '13 at 20:34
  • PHP doesn't care that you are using JSON. The data is counted as input variables, so the warning is triggered. What downside do you see in using the configuration option as it was intended? – George Cummins May 09 '13 at 20:38
  • The Laravel link doesn't work; can you post some sample code? – chugadie Mar 17 '16 at 12:50
  • Post the JSON as a string with `JSON.stringify()` instead and use the PHP function `json_decode` on the receiving side to turn the data into an array. – m13r Nov 21 '17 at 15:04

12 Answers

23

Change this

; max_input_vars = 1000

to this:

max_input_vars = 3000

(The `;` is the comment character in php.ini, so the line must be uncommented as well as changed.)
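If the change doesn't seem to take effect, a quick way to check which value is actually live (this check is my addition, not part of the answer; note the CLI and the web server often read different php.ini files):

```php
<?php
// Report the currently active limit; php.ini edits only show up here
// after the web server / PHP-FPM has been restarted.
echo 'max_input_vars = ', (int) ini_get('max_input_vars'), PHP_EOL;
```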
Faridul Khan
  • Do this when you want to allow an attacker to increase the CPU load on your server to have to process 3,000 HTTP queries instead of just 1,000. – John May 23 '22 at 16:37
  • @John [max_input_vars](https://www.php.net/manual/en/info.configuration.php#ini.max-input-vars) is the number of input variables, not the number of accepted HTTP requests – catcon Nov 30 '22 at 04:33
  • @John: yes, max_input_vars limits the number of input variables in **each** request, which minimizes the chance for attackers to exploit hash collisions that might trigger a DDoS attack. The setting **does not** limit the number of HTTP requests as you said in your first comment. – catcon Nov 30 '22 at 21:59
22

Something is wrong; you do not need 1000 variables. Recode your program to accept one array variable with 1000 keys. This error is there to warn you that you are not doing things the recommended way. Do not disable it, extend it, or hide it in any way.
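Note that simply nesting the values is not enough: as the comments below point out, each element of a nested POST array still counts toward the limit. What does avoid the limit is sending the whole structure as one string field and decoding it yourself (a hypothetical sketch; `payload` is an illustrative field name, not from the answer):

```php
<?php
// One input variable ($_POST['payload']) carrying arbitrarily many keys
// as a JSON string, decoded server-side into the array you need.
function unpack_payload(array $post): array {
    $data = json_decode($post['payload'] ?? '[]', true);
    return is_array($data) ? $data : [];
}
```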

chugadie
  • What if it's Magento with 1.9 GB? – inrsaurabh Aug 29 '18 at 10:55
  • I have a table of 70 rows and 13 columns, with 2 text boxes in each cell. Yeah, I'm definitely not doing things the recommended way. `^_^` – ADTC Nov 08 '18 at 05:08
  • I tried using associative arrays for the variable names, but I'm still getting the warning (and the data is still truncated). I expected an array within the POST array to be counted as one variable, but apparently it's still counted as the number of elements in the array. – ADTC Nov 08 '18 at 05:38
  • I'm now using JavaScript to convert the table of text boxes into a JSON array, then passing that in POST as a single variable. In PHP, I'm decoding that into the associative array I need. Works perfectly: just 1 variable instead of 1,820. `:)` – ADTC Nov 09 '18 at 00:06
  • This is incorrect. I have 2 variables with 500 keys each, which triggers the error. – Catalin Jun 10 '21 at 06:38
17

I think most of the time it is not necessary to increase the max_input_vars size; it is better to optimize your code.

I faced this problem when getting all the results from one AJAX request and sending those results on to another AJAX request. What I did was stringify the array built from the DB results:

JSON.stringify(totalResults);

In JavaScript, JSON.stringify converts the array to a string. After converting, I sent that string in the second request and decoded it back into an array with json_decode in PHP:

<?php $totalResults = json_decode($_POST['totalResults']); ?>

That way I got the original array back. I hope this helps someone; that's why I shared it. Thank you.

Haritsinh Gohil
  • Yep, this was precisely my situation. I could have easily changed max_input_vars to some greater number, but how would I know that a bigger number was big enough for dataset sizes I have no control over? So these conversions are what I now do. Now all I have to worry about is post_max_size... – Jim Jan 31 '20 at 15:08
  • This is the correct answer for cases where you're passing data back and forth between JS and PHP. Thanks! – Fid Jul 13 '22 at 10:49
16

I found out that the right way to handle JSON data directly in PHP (via file_get_contents('php://input')) is to make sure the request sets the right content type, i.e. Content-Type: application/json in the HTTP request header.

In my case I'm requesting pages from PHP using curl with this code:

function curl_post($url, array $post = NULL, array $options = array()) {
  $defaults = array(
    CURLOPT_POST => 1,
    CURLOPT_HEADER => 0,
    CURLOPT_URL => $url,
    CURLOPT_FRESH_CONNECT => 1,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_FORBID_REUSE => 1,
    CURLOPT_TIMEOUT => 600
  );
  if(!is_null($post))
    $defaults[CURLOPT_POSTFIELDS] = http_build_query($post); // the constant, not the string 'CURLOPT_POSTFIELDS'
  $ch = curl_init();
  curl_setopt_array($ch, ($options + $defaults));
  if(($result = curl_exec($ch)) === false) {
    throw new Exception(curl_error($ch) . "\n $url");
  }
  if(curl_getinfo($ch, CURLINFO_HTTP_CODE) != 200) {
    throw new Exception("Curl error: ". 
      curl_getinfo($ch, CURLINFO_HTTP_CODE) ."\n".$result . "\n");
  }
  curl_close($ch);
  return $result;
}

$curl_result = curl_post(URL, NULL,
    array(CURLOPT_HTTPHEADER => array('Content-Type: application/json'),
      CURLOPT_POSTFIELDS => json_encode($out))
    );

Do note the CURLOPT_HTTPHEADER => array('Content-Type: application/json') part.

On the receiving side I'm using the following code:

$rawData = file_get_contents('php://input');
$postedJson = json_decode($rawData,true);
if(json_last_error() != JSON_ERROR_NONE) {
  error_log('Last JSON error: '. json_last_error(). 
    json_last_error_msg() . PHP_EOL. PHP_EOL,0);
}

Do not change the max_input_vars variable. Since I changed the request to set the right headers, my issue with max_input_vars went away: apparently PHP does not parse the body into input variables when this Content-Type is set.

Fredrik Erlandsson
6

According to the PHP manual, the role of max_input_vars in php.ini is:

How many input variables may be accepted (limit is applied to $_GET, $_POST and $_COOKIE superglobal separately). Use of this directive mitigates the possibility of denial of service attacks which use hash collisions. If there are more input variables than specified by this directive, an E_WARNING is issued, and further input variables are truncated from the request. This limit applies only to each nesting level of a multi-dimensional input array.

You just have to assign a greater number to max_input_vars in your php.ini.

ReDetection
Mateusz
  • So, since I can't really predict how long my JSON data will be, should I change this to infinity? Would it be better to submit my JSON as a POST var (like when uploading a file to PHP)? – Fredrik Erlandsson May 09 '13 at 20:46
  • Even if you send it in one POST variable, you'll eventually have to parse it and create more variables (unless you're processing the whole JSON string and nothing more). It's quite rare not to know how long a JSON object can be, but in this situation you can either change it to `INF` or something very big. Beware of DoS though! – Mateusz May 09 '13 at 20:51
  • _"This limit applies only to each nesting level of a multi-dimensional input array."_ Where did you get this? It doesn't seem to be true: from a quick test, the limit applies to the total number of cells in the array, not per nesting level. – ADTC Nov 08 '18 at 05:36
5

Really, changing max_input_vars using an .htaccess file works, but you need to restart the Apache service.

Follow the complete process:

  1. Go to C:\xampp\apache\conf\httpd.conf
  2. Open the file httpd.conf
  3. In the file, look for xampp/htdocs
  4. A bit lower, you should see a line containing AllowOverride
  5. If there is a # before AllowOverride, delete the #
  6. On the next line after AllowOverride, insert php_value max_input_vars 10000
  7. Save the file and close it
  8. Finally, stop and restart Apache (the change takes effect only after the restart)
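Steps 4–6 amount to something like the following fragment in httpd.conf (a sketch only: the Directory path is XAMPP's default, and php_value lines work only when PHP runs as an Apache module, not as FastCGI/PHP-FPM):

```apacheconf
<Directory "C:/xampp/htdocs">
    # Step 5: AllowOverride must not be commented out
    AllowOverride All
    # Step 6: raise the limit
    php_value max_input_vars 10000
</Directory>
```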
techraf
Marco
1

Changing max_input_vars in php.ini didn't work for me.

Changing max_input_vars using an .htaccess file did work.

If you want to change max_input_vars using an .htaccess file, make sure AllowOverride is enabled in the C:\xampp\apache\conf\httpd.conf file (in XAMPP), then add the following line to your .htaccess file:

php_value max_input_vars 10000

Dasun Lahiru
1

I was getting a slashes issue when using response data to make another AJAX call, and I used the stripslashes() function to get around it.

Javascript AJAX:

course_data: JSON.stringify(response.data)

PHP Backend:

$course_data = json_decode(stripslashes($_POST['course_data']));
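A minimal illustration of the problem (my own example string, assuming some layer escaped the quotes before the JSON reached PHP):

```php
<?php
$escaped = '{\\"id\\":1}';          // what arrives: {\"id\":1} – invalid JSON
var_dump(json_decode($escaped));    // NULL, decoding fails
$clean = stripslashes($escaped);    // {"id":1}
var_dump(json_decode($clean)->id);  // int(1)
```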
Md. Zubaer Ahammed
0

I had the same problem. It turns out I was sending a 2 MB file in the POST data, which was sometimes interpreted weirdly as form input, so PHP found all these "variables" in it.

I ended up sending the information using curl's file parameters, while the file content still travelled in the request body:

curl_setopt($ch, CURLOPT_PUT, true);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'POST');
curl_setopt($ch, CURLOPT_INFILE, ($fp = fopen("filename.txt", "r")));
curl_setopt($ch, CURLOPT_INFILESIZE, filesize("filename.txt"));
$result = curl_exec($ch);
fclose($fp); // close the handle only after curl_exec() has run
Mikaël Mayer
0

I am running an Apache server on my PC, and this is how I solved the issue mentioned above:

  1. Open the php.ini file.

  2. Uncomment ; max_input_vars = 1000 by removing the semicolon.

  3. Change the value to 10000, i.e. max_input_vars = 10000

0

If it happens while running through Visual Studio Code:

create or update the file .user.ini

add the line

max_input_vars = 2000

save the file and restart the server

Merrin K
-3

<rant>I can accept that PHP has such a limit in place; it does make sense. What I cannot accept (and one of the many reasons it is very difficult for me to take PHP seriously as a programming language) is that processing then just continues with the truncated data, potentially overwriting good data with incomplete data. Yes, the data should be validated additionally before persisting it. But this behavior is just begging for trouble.</rant>

That said, I implemented the following to prevent this from happening again:

$limit = (int)ini_get('max_input_vars');
if (count($_GET) >= $limit) {
    throw new Exception('$_GET is likely to be truncated by max_input_vars (' . $limit . '), refusing to continue');
}
if (count($_POST) >= $limit) {
    throw new Exception('$_POST is likely to be truncated by max_input_vars (' . $limit . '), refusing to continue');
}
if (count($_COOKIE) >= $limit) {
    throw new Exception('$_COOKIE is likely to be truncated by max_input_vars (' . $limit . '), refusing to continue');
}

Note that truncation doesn't necessarily happen exactly at the limit. My limit was set to the default 1000, but $_POST still ended up having 1001 elements, hence the >= comparisons above.

jlh