
I want to run a stress test to see whether locks or deadlocks occur when 400 different rows are updated per second, with one SELECT on a random row per second at the same time.

I have PHP methods that do this. The question is: how can I simulate the 400 updates simultaneously (as if from different users)?

Regarding this answer: Simultaneous Requests to PHP Script

Will the requests be queued?

the requests come from the same client AND the same browser; most browsers will queue the requests in this case, even when there is nothing server-side producing this behaviour.

I'm trying curl_multi right now, sending 400 requests at the same time, but I'm only getting around 25 updates per second.
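For reference, a minimal sketch of the curl_multi approach described above. The endpoint URL, the `row` query parameter, and the function name are assumptions for illustration, not part of the original question:

```php
<?php
// Minimal curl_multi sketch: fire $n requests concurrently against a
// hypothetical update endpoint. The URL and the ?row= parameter are
// placeholders; adapt them to your own script.
function fireConcurrent(string $baseUrl, int $n): array
{
    $mh = curl_multi_init();
    $handles = [];
    for ($i = 0; $i < $n; $i++) {
        $ch = curl_init($baseUrl . '?row=' . $i);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // block briefly instead of busy-waiting
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $ch) {
        $results[] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Even with this, throughput is capped by the client machine, PHP's curl layer, and the web server's worker pool, which may explain the ~25 updates/second ceiling and why a dedicated load-testing tool was suggested in the comments.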

  • I wouldn't do this with PHP and/or curl.... I'd use a dedicated performance testing tool like Gatling – Mark Baker Mar 31 '15 at 12:24
  • Thanks, Mark, I'll give it a look – Alex Kalmikov Mar 31 '15 at 12:26
  • Are you using a MySQL database? If so, check that the table engine is InnoDB and not MyISAM. That should increase the number of updates per second, since the InnoDB engine locks at row level instead of table level (as MyISAM does) when executing insert/update operations – YyYo Mar 31 '15 at 13:05
  • Yes, I'm using MySQL with InnoDB. But as I've heard, in some cases when multiple rows are updated and those rows are close to each other, InnoDB may lock the whole table instead of just the rows. That's the case I want to test. – Alex Kalmikov Mar 31 '15 at 13:52

1 Answer


I wouldn't do this with PHP and/or curl.... I'd use a dedicated performance testing tool like Gatling

That's it. Great tool, Mark, thanks a lot.
