
I have a PHP file that queries a MySQL database in the background, given a few constraints, and then returns some data.

The only issue is that if I get an influx of requests all at once, the last requests have to wait a while to get their responses.

Is there a better way to handle this kind of thing so that the requests are fast and no one has to wait?

** Added Info

Currently my script takes in a few POST parameters, verifies some information against the database, and then echoes a JSON-encoded response string.
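Roughly, it looks like this (a simplified sketch; the table and column names here are placeholders, not my real schema):

    <?php
    // Simplified sketch: read POST parameters, verify them against the
    // database with a prepared statement, echo a JSON-encoded response.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $email = isset($_POST['email']) ? $_POST['email'] : '';

    $stmt = $pdo->prepare('SELECT id, name FROM users WHERE email = ?');
    $stmt->execute(array($email));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    header('Content-Type: application/json');
    echo json_encode(array('ok' => (bool) $row, 'data' => $row));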

If I make 1000 requests through XMLHttpRequests in JS, then the 1000th request comes back almost a minute later.

As far as I know PHP handles requests one at a time, but then how do major sites like Facebook cope when thousands upon thousands of users update info at the same time?

user3916570

2 Answers


How do the big guys do it?

  • Multiple web servers
  • Multiple MySQL Slaves
  • Efficient PHP (or whatever) code
  • *nix, not *dows
  • Serious load balancers, routers, etc
  • Hardware RAID controller with write cache in front of multiple drives
  • PHP handles one request at a time, but Apache runs multiple PHP processes in parallel
  • No Query Cache -- the tables are changing too frequently.
  • Good indexes
  • etc.

(I used to work for one of the biggest guys.)

Rick James

I don't think the issue is PHP, or really even MySQL; I think your MySQL database is not set up quite right.

So I assume in PHP you have a query something like

select * from mytable where email = ?

and that you return the result as JSON.

Now if you have hundreds of thousands of rows in your MySQL database, this will be slow if you are not indexing the field or fields in your WHERE clause.

See this answer on how to index MySQL tables: How do I add indices to MySQL tables?

Short answer: you need to index any fields in your WHERE clause. E.g. if you had only email in the WHERE clause, then you need to index email (you only need to do this once; MySQL will maintain the index from then on):

ALTER TABLE `table` ADD INDEX `email` (`email`)
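You can verify the index is actually being used with EXPLAIN. A quick sketch from PHP (the connection details and names are placeholders):

    <?php
    // Sketch: after adding the index, EXPLAIN should report the query
    // using the `email` key (type "ref") rather than a full table scan
    // (type "ALL").
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $stmt = $pdo->query("EXPLAIN SELECT * FROM mytable WHERE email = 'a@b.com'");
    print_r($stmt->fetchAll(PDO::FETCH_ASSOC));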

Adding this index will result in MySQL going SUPER fast, and 1000 requests won't take 60 seconds. Actually, according to this page, you should be able to do 92,355 requests in 1 second going to a database and back.

https://www.techempower.com/benchmarks/ (look for php-raw).

UPDATE

Not PHP's issue. The OP is simply using XMLHttpRequest from a browser to do the testing; since a browser can only make between 2 and 6 simultaneous requests to the same domain, the testing procedure was simply flawed.
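If you want to drive genuinely concurrent requests from PHP itself instead of the browser, a curl_multi sketch like the one below avoids the per-domain cap (the URL and POST field are placeholders):

    <?php
    // Sketch: fire $n concurrent requests with curl_multi, bypassing the
    // browser's ~6-connections-per-domain limit, and time the whole batch.
    $url = 'http://localhost/script.php';
    $n   = 100;

    $mh      = curl_multi_init();
    $handles = array();
    for ($i = 0; $i < $n; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, array('email' => 'a@b.com'));
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    $start = microtime(true);
    do {
        curl_multi_exec($mh, $running);   // run the transfers
        curl_multi_select($mh);           // wait for activity on any handle
    } while ($running > 0);
    printf("%d requests in %.2f seconds\n", $n, microtime(true) - $start);

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);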

Prashanth
John Ballinger
  • This had no effect on the time it takes to process the 1000 requests – user3916570 Aug 11 '15 at 02:01
  • I would only suggest you post your code as 1000 requests should not take 60 seconds. How are you testing by the way??? – John Ballinger Aug 11 '15 at 07:29
  • I have another HTML/JS file that creates 1000 XMLHttpRequests to that PHP file with basic info, and then I see how long those responses take to come back – user3916570 Aug 11 '15 at 13:25
  • I would use a tool like Apache Bench. Here is an example of an app doing 1000 requests in 3 seconds: https://www.digitalocean.com/community/tutorials/how-to-use-apachebench-to-do-load-testing-on-an-arch-linux-vps This tells me the problem is something you are doing, most likely your XMLHttpRequest testing, especially since your browser can only make approx. 6 concurrent requests to the same domain: http://www.coderanch.com/t/631345/blogs/Maximum-concurrent-connection-domain-browsers So PHP is not slow; it's your testing. www.getcharles.com can also do testing against a URL (disclaimer: made by a friend). – John Ballinger Aug 11 '15 at 22:54