I'm currently using AJAX and PHP to send updates to a PostgreSQL database.
Say I had 1000 users, each sending one AJAX POST request per second to a PHP script, and that script opened a connection, executed two SQL UPDATE commands, and closed the connection every time it ran.
That would be 1000 connections per second - I'm guessing that isn't going to work out very well for me.
If it's not, how should I deal with it? I've read that node.js is a good solution - if it is, are there any good guides for updating a PostgreSQL database from a webpage using JavaScript?
I already have data (some JSON, some in other formats) in the PostgreSQL database, and it needs to stay there, so ideally I could just change how the handshake between the JavaScript and the database works and leave the rest the same.
As a side question: how many connections per second should I expect to be able to handle if that's my only bottleneck? And if there are more than the maximum of 150 connections, does it just queue the extra connections, or does it do something obnoxious like return a 'max connections hit' error and refuse to serve page loads?
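To make the queueing behaviour I'm hoping for concrete, here's a toy in-process sketch (plain node, no real database; the pool size, names, and timings are all made up for illustration) of what I'd want a connection pool to do - hold excess requests in a FIFO queue instead of rejecting them outright:

```javascript
// Toy connection pool: at most `size` "connections" are in use at once;
// extra acquire() calls wait in a FIFO queue instead of failing.
class ToyPool {
  constructor(size) {
    this.free = size;    // connections currently available
    this.waiting = [];   // queued acquire() callbacks
  }
  acquire() {
    return new Promise(resolve => {
      if (this.free > 0) { this.free--; resolve(); }
      else this.waiting.push(resolve);  // queue instead of erroring out
    });
  }
  release() {
    const next = this.waiting.shift();
    if (next) next();    // hand the connection straight to the next waiter
    else this.free++;
  }
}

// Simulate 10 concurrent "requests" against a pool of 3:
// peak concurrency never exceeds the pool size.
async function simulate() {
  const pool = new ToyPool(3);
  let inUse = 0, peak = 0;
  await Promise.all(Array.from({ length: 10 }, async () => {
    await pool.acquire();
    inUse++; peak = Math.max(peak, inUse);
    await new Promise(r => setTimeout(r, 10)); // pretend to run two UPDATEs
    inUse--;
    pool.release();
  }));
  return peak;
}

simulate().then(peak => console.log('peak concurrent connections:', peak));
```

If something like this is what real pooling libraries do, then the 1000-requests-per-second case would reuse a handful of open connections rather than opening and closing 1000 per second.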