
My PHP code allows users to execute arbitrary SQL as a read-only user. It also needs access to a write-capable user. My code executes commands as the write-capable user, then queries the database over a separate connection as the read-only user. It then queries as the write-capable user again and exits the script.

It leaves the write-capable user's connection open while it runs the read-only user's queries. As far as I am aware this is the best way to do it, but my colleague is concerned that this is somehow MySQL bad practice and wants to close the write-capable user's connection and reopen it later (presumably because keeping it open doubles the number of concurrent connections to the MySQL server). What is the best way to do this?
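Roughly, the flow looks like this (a simplified PDO sketch; the DSNs, user names, table names and queries are placeholders, not our actual code):

    <?php
    // 1. Open the write-capable connection and run the setup command.
    $write = new PDO('mysql:host=localhost;dbname=app', 'app_writer', 'secret');
    $write->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $write->exec("INSERT INTO jobs (payload) VALUES ('...')");

    // 2. Open a second, read-only connection and run the user-supplied SQL
    //    while the write connection stays open but idle.
    $read = new PDO('mysql:host=localhost;dbname=app', 'app_reader', 'secret');
    $read->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $rows = $read->query($userSuppliedSql)->fetchAll(PDO::FETCH_ASSOC);

    // 3. Use the write-capable connection one more time, then the script
    //    ends and PHP closes both connections for us.
    $write->exec("UPDATE jobs SET status = 'done' WHERE id = 1");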

The question "whats more efficient and why: one db connection per page or one db connection per function?" says that "Typically database connections are expensive to create."

Bardi Harborow
  • Look at connection pooling. If you are serious about scaling your application, then connection pooling is a must. http://stackoverflow.com/questions/18264415/efficient-way-to-connect-to-a-database-when-multiple-functions-are-running-queri/18264467#18264467 – Namphibian Oct 14 '13 at 22:35
  • @Namphibian: While your answer is correct in general, it does not apply to PHP, because all connections are closed when the script ends. Opening persistent connections has its own set of trouble and should only be considered if opening connections has been identified as the relevant source of bad performance. – Sven Oct 14 '13 at 22:37
  • Thanks Sven, I am not a PHP developer so I'm not sure of the finer details. – Namphibian Oct 14 '13 at 22:40

2 Answers


The question "whats more efficient and why: one db connection per page or one db connection per function?" says that "Typically database connections are expensive to create."

I'd challenge that statement. It might be true for some databases, but it need not be true for ALL of them. MySQL is known to be very lightweight when creating connections, and even more so when using local unix domain sockets.
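For what it's worth, connecting through the local socket instead of TCP is just a matter of the DSN (sketch only; the socket path shown is a common default and depends on your server configuration, the credentials are placeholders):

    <?php
    // Connect via the local Unix domain socket rather than TCP/IP.
    $pdo = new PDO(
        'mysql:unix_socket=/var/run/mysqld/mysqld.sock;dbname=app',
        'app_reader',
        'secret'
    );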

The more interesting question is: what if you use a feature that requires staying on the same connection, like inserting a row and then selecting LAST_INSERT_ID()? If you issue that SELECT on the read-only connection, it will not work.
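To illustrate (hypothetical sketch; $write and $read are the two PDO connections from your setup, the table name is made up): LAST_INSERT_ID() is tracked per connection, so the read-only connection has no idea which id the write connection just generated.

    <?php
    $write->exec("INSERT INTO articles (title) VALUES ('Hello')");

    // Correct: ask the connection that performed the INSERT.
    $id = $write->query('SELECT LAST_INSERT_ID()')->fetchColumn();
    // (or simply $write->lastInsertId())

    // Wrong: this connection never inserted anything, so it returns 0.
    $id = $read->query('SELECT LAST_INSERT_ID()')->fetchColumn();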

While I do think that using a read-only user account is beneficial for security, it only makes sense if that is the ONLY account used in a script. Otherwise you'd need some logic that decides, based on the kind of query, which connection to use; and if the right connection is picked automatically for reads and writes anyway, using two connections does not gain you anything from the security perspective.

Also, you'd be unable to use SELECT statements inside a transaction if the SELECT goes through a different connection: the second connection is a separate session and will not see the transaction's uncommitted changes.
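A sketch of that pitfall (same placeholder connections and table names as above; assumes InnoDB tables with the default isolation level):

    <?php
    $write->beginTransaction();
    $write->exec("INSERT INTO orders (total) VALUES (9.99)");

    // The read-only connection is a separate session, so it does not see
    // the uncommitted row (and would not notice a rollback either).
    $count = $read->query('SELECT COUNT(*) FROM orders')->fetchColumn();

    $write->commit(); // only now does the row become visible to other sessions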

All in all: Using more than one connection to do the things that one connection could do as well seems like a bad idea - unless you can give some more reasons why you are doing it.

Reading your question once again, I stumbled upon your particular reason: you execute arbitrary SQL statements. In that case it really makes sense to use a restricted account. It also does not make sense to open and close the waiting write-enabled connection; the only reason to do that would be if the database server were reaching its configured limit of concurrent connections.
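If you ever suspect you are getting close to that limit, you can compare the configured maximum with the current usage via the standard MySQL status queries (sketch; $read is any open connection):

    <?php
    $max  = $read->query("SHOW VARIABLES LIKE 'max_connections'")->fetch(PDO::FETCH_ASSOC);
    $open = $read->query("SHOW STATUS LIKE 'Threads_connected'")->fetch(PDO::FETCH_ASSOC);
    echo "{$open['Value']} of {$max['Value']} connections in use\n";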

Sven
  • I think I'd want separate scripts in this case; one slip of the finger and you are executing arbitrary SQL against a writable connection. – Tony Hopkinson Oct 14 '13 at 22:50
  • We are executing arbitrary SQL statements to give developers full access to the database (all data is public). @TonyHopkinson, the write access is needed to complete the command, and yes, we will be careful. So you are saying that there is no problem in having two concurrent connections to the same database? I am personally fine with that; my colleague just needed some reassurance because he has never seen an example of this in any other code. Thanks for your help. – Bardi Harborow Oct 15 '13 at 21:13
  • Oh, I've seen a fair few things using two concurrent connections, though usually you'd maintain them for a lot longer than one script. The only way to find out whether it might be a problem is to get some monitoring going; there's usually a hard limit and a lower practical one, but given the brevity, you'd have to start getting serious traffic before it starts choking, I should think. – Tony Hopkinson Oct 15 '13 at 21:25

Your current solution is demonstrably fine as long as you can clearly keep track of which connection is which, and it even has the advantage of making it easy to scale out to a master-slave(s) scenario later. There is something to be said for only opening a write connection when you need it, but in the short-lived world of web requests (which I assume we're talking about), once it is open it's fine to leave it open in case you need it again, or to let it close automatically half a second later when the request has presumably ended.
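If you did want to defer it, a small lazy getter is all it takes (sketch only; the function name, DSN and credentials are placeholders):

    <?php
    // Opens the write-capable connection only on first use.
    function writeConnection(): PDO
    {
        static $write = null;
        if ($write === null) {
            $write = new PDO('mysql:host=localhost;dbname=app', 'app_writer', 'secret');
            $write->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        }
        return $write;
    }

    // Nothing is opened until the first write actually happens.
    writeConnection()->exec("UPDATE jobs SET status = 'done' WHERE id = 1");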

If we were talking about permanently running daemons, then by all means close the connection after N seconds/minutes of inactivity; you'd probably have more than one connection anyway, which lets you run multiple queries asynchronously.

Wrikken