
I am accepting rather complex form data from clients (web and mobile) on a PHP server. When I receive the data, I don't need to process it in any way; I just need to store it somewhere and let the client know that the form was successfully submitted. I batch-process the data later.

What's the best way to quickly store the incoming form data?

Two functions seem relevant in this context: `serialize()` and `json_encode()`.

Also, instead of sending this serialized data to a database, it would be a lot faster to append it to some text file, right?

Can someone give me more information about these things?

treecoder
    json encode it and write to a file where the script has write permissions. – r3wt Nov 30 '14 at 23:42
  • I think serialize would be less time consuming than JSON encode. – treecoder Nov 30 '14 at 23:43
    Perhaps, using Redis to store serialized or json encoded form data would be an elegant solution. – Ali MasudianPour Nov 30 '14 at 23:44
  • I'd probably just use a database, especially if this was more convenient. Remember it doesn't have to be "fastest", just "fast enough". Also, things may be faster/slower depending on your data, so a good answer to this question is "measure what is fastest". – halfer Nov 30 '14 at 23:45
  • "Also, instead of sending this serialized data to database, it would be lot faster to append it to some text file, right?" — not sure there would be much difference. Creating/appending to a text file could take as much time as writing into MySQL. Maybe more... – Christian Bonato Nov 30 '14 at 23:45
    Do you want to be able to scale? Do you want durable data? You won't achieve either of those if you store data in a *local* file. – Karoly Horvath Nov 30 '14 at 23:52
  • @treecoder google this: `serialize vs json_encode php` – r3wt Nov 30 '14 at 23:53
  • Thanks for `serialize vs json_encode php`. I got all my answers now! – treecoder Nov 30 '14 at 23:57
  • http://stackoverflow.com/questions/804045/preferred-method-to-store-php-arrays-json-encode-vs-serialize – treecoder Nov 30 '14 at 23:59

3 Answers


If you just need the text from the POST, you can read the plain POST request body without any serialization.

echo file_get_contents('php://input');

Appending to a file is pretty fast. But remember that you will also need to manage this file.
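For instance, a minimal sketch of that approach might look like the following (the dump file path and the exclusive lock are my own assumptions, not part of the original answer):

// Read the raw request body exactly as the client sent it
$raw = file_get_contents('php://input');

// Append one record per line; LOCK_EX keeps concurrent requests from interleaving
// (the path /var/data/form-dump.txt is hypothetical)
file_put_contents('/var/data/form-dump.txt', $raw . PHP_EOL, FILE_APPEND | LOCK_EX);

// Acknowledge the submission to the client
echo 'OK';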

P.S. It looks like premature optimization.

sectus

I would consider

base64_encode(serialize($_POST));

// assuming $array is the returned MySQL column (BLOB type)
unserialize(base64_decode($array));

Please note that base64_encode is optional and depends on the data stored, which may or may not need to be encoded.
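A rough sketch of the storage side might look like this (the submissions table, its payload column, and the $pdo connection are hypothetical, just to illustrate the round trip):

// Encode the raw form data for storage (base64 is optional, as noted above)
$payload = base64_encode(serialize($_POST));

// Insert into a hypothetical submissions(payload BLOB) table via PDO
$stmt = $pdo->prepare('INSERT INTO submissions (payload) VALUES (:payload)');
$stmt->execute([':payload' => $payload]);

// Later, during batch processing:
// $data = unserialize(base64_decode($row['payload']));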

Christian Bonato
  • Well, if like me you have had a lot of experience storing all kinds of data in MySQL (e-mail bodies, PHP code, JPEG files and strings containing all kinds of chars), base64_encode ensures smooth CRUD operations. But I agree that in this case it is overkill. – Christian Bonato Dec 01 '14 at 02:29
  • None of the above require base64 to be stored properly. It also adds about 30% in size. Are you sure you know what you're doing? ;-) –  Dec 01 '14 at 02:33
  • Dear friend, as I said, this comes from real-world experience on massive apps. I just don't base64_encode stuff for fun. Otherwise these people don't know what they're talking about either: http://stackoverflow.com/questions/35879/base64-encoding-image – Christian Bonato Dec 01 '14 at 02:39

The fastest way to save the data is not to use PHP, but to write a C extension for whatever web server you're using that simply saves the file. But fastest is not the best. Unless you're talking about trillions of requests per hour, you're better off doing at least some processing inline and storing the data in a relational DB or object DB. I would personally go with MySQL or Mongo since I have some experience with them, but many other datastores could be appropriate.
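As a rough illustration of that kind of minimal inline processing (the submissions table and the $pdo connection are assumptions for the sketch; the answer doesn't prescribe a schema):

// Store the submission verbatim and let the batch job do the real work later
$stmt = $pdo->prepare(
    'INSERT INTO submissions (received_at, payload) VALUES (NOW(), :payload)'
);
$stmt->execute([':payload' => json_encode($_POST)]);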

Trying to outsmart databases by "simply appending to a file" will likely get you into a lot of trouble. Databases already have a lot of support for concurrent operations, optimizing writes, caching, transactional support, etc... If you "append" to a file from multiple web requests concurrently, you'll end up with corrupt data.

I know it's a bit of a non-answer, but it's been my experience that the type of question you asked is misguided at best. If you truly, absolutely need to do something "the fastest way possible", you're likely not going to need to ask about it on Stack Overflow, because you've gotten yourself into a cutting-edge development environment where you have a decade or more of experience creating such implementations.

Daniel