This kind of load is certainly manageable, depending on how much data you expect each field to contain, or rather, the maximum amount of data you have determined each field can contain. In PHP, the maximum size of the body of an HTTP POST request (the part that contains the form-encoded values) is determined by the ini value post_max_size. It defaults to 8 MB, but you can change this in your php.ini:
post_max_size = 10M ; megabytes
Or in your .htaccess:
php_value post_max_size 10M
Take care when setting this, because it should be no more than the amount of RAM available on your system. Also consider that multiple users could be requesting this page at once; if each of them is allocated an exorbitant amount of RAM for their request, they could hang or crash your server.
However, consider the math here. Even if you had 100 fields each containing 20 bytes, that would only be 2,000 bytes, which is about 2 KB. Even with a 1 Mbps upload speed, which is pretty slow, your users will be able to upload 128 KB per second. At that speed, each of the 100 fields would have to contain about 1,311 bytes of data for the upload to take a full second.
On Apache, the default timeout is 300 seconds, so your form fields would have to contain a combined total of 37.5 MB before Apache would time out. This setting may have been altered by your host (or your server admin) and is probably set to a more reasonable value such as 30 seconds. But even at that limit, you would need 3.75 MB of data, which is almost certainly more than 100 fields will ever contain.
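If you (or your server admin) do want to change it, it is the Timeout directive in Apache's httpd.conf:
Timeout 30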
You should also not be concerned about the client side, because even the stingiest browser (IE) limits POST uploads to 2 GB.
Basically, my point here is that even with a slow connection, HTTP and your server are perfectly capable of handling that many fields. I'm not sure how long it would take PHP to parse all of them (you'd have to benchmark it on your server), but I imagine the impact will be negligible.
From a user's standpoint, I'd say that 100 fields would be a pretty daunting sight. If at all possible, it might be nicer to separate your form into smaller, friendlier steps that walk the user through the process of filling it out. If you would rather not split the form into steps, at least look into saving the state of the form with JavaScript. Note that the W3C recommends 5 MB of storage space for localStorage, so this should be plenty of space to store all of your fields.
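For instance, a minimal sketch of saving every input's value to localStorage (the saveFormToStorage name and the 'savedForm:' key prefix are just placeholders I've made up for illustration):
function saveFormToStorage() {
    var inputs = document.forms[0].getElementsByTagName('input');
    for (var i = 0; i < inputs.length; i++) {
        // Key each value by field name so it can be restored on the next visit
        localStorage.setItem('savedForm:' + inputs[i].name, inputs[i].value);
    }
}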
As a fallback, you can also use cookies, but be wary: cookies have more limits than localStorage. I've read that cookies are limited to 4 KB each and 20 cookies per domain, so you might want to distribute your stored form fields across several cookies, say 10 form fields in each of 10 cookies. You can store multiple input values in one cookie using encodeURIComponent():
var inputs = document.forms[0].getElementsByTagName('input'),
    i = 0,
    date = new Date(),
    expires;

// Expires date (1 day in the future)
date.setTime(date.getTime() + (24 * 60 * 60 * 1000));
expires = date.toUTCString();

for (var cookieNumber = 0; cookieNumber < 10; cookieNumber++) {
    var cookie = [];
    // Pack up to 10 name=value pairs into each cookie
    for (; i < (cookieNumber * 10 + 10) && i < inputs.length; i++) {
        cookie.push(encodeURIComponent(inputs[i].name) + '=' + encodeURIComponent(inputs[i].value));
    }
    document.cookie = 'savedForm' + cookieNumber + '=' + cookie.join('&') + '; expires=' + expires;
}
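Restoring is the reverse: read document.cookie, pick out the savedForm cookies, and decode each pair back into its field. A rough sketch, assuming the naming above:
var cookies = document.cookie.split('; '),
    form = document.forms[0];
for (var c = 0; c < cookies.length; c++) {
    // Only process our savedForm0..savedForm9 cookies
    if (cookies[c].indexOf('savedForm') !== 0) continue;
    var pairs = cookies[c].slice(cookies[c].indexOf('=') + 1).split('&');
    for (var p = 0; p < pairs.length; p++) {
        var parts = pairs[p].split('='),
            field = form.elements[decodeURIComponent(parts[0])];
        if (field && parts.length === 2) field.value = decodeURIComponent(parts[1]);
    }
}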
To ensure that everything is saved as the user types it in, you might want to update your stored data onchange or, if you want up-to-the-second saves, onkeyup.
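For instance, assuming the saveFormToStorage() function sketched above:
var fields = document.forms[0].getElementsByTagName('input');
for (var i = 0; i < fields.length; i++) {
    // Re-save the whole form whenever any field changes
    fields[i].onchange = saveFormToStorage;
    fields[i].onkeyup = saveFormToStorage;
}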
Also, as another convenience to the user, when the form is submitted, all saved cookie and localStorage form field data should be cleared, so that when they visit the form again all of the fields will be empty and ready for new data input.
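A sketch of that cleanup, run from the form's onsubmit handler (again assuming the savedForm naming used above):
document.forms[0].onsubmit = function() {
    var inputs = this.getElementsByTagName('input');
    // Remove the localStorage entries
    for (var i = 0; i < inputs.length; i++) {
        localStorage.removeItem('savedForm:' + inputs[i].name);
    }
    // Expire the savedForm0..savedForm9 cookies by setting a past date
    for (var c = 0; c < 10; c++) {
        document.cookie = 'savedForm' + c + '=; expires=' + new Date(0).toUTCString();
    }
};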