
I have been working on something using AJAX that sometimes requires a couple of POSTs per second. Each POST returns a JSON object (generated by the PHP file being posted to, ~11,000 bytes), and on average the latency is between 30ms and 250ms depending on whether I'm on wifi or wired, but roughly 1 in 15 calls spikes up to about 4000ms. I am trying to find a way around this; as of right now I see two options:

  1. Throw a timeout on the AJAX call, and have it fall back to a GET on failure (the POST should still go through; it's the return trip that always times out; see the sketch after this list) or...

  2. Cut the entire thing down, learn node.js so I can use websockets to potentially rectify this issue.
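
For option 1, I'm imagining something like this rough sketch (jQuery assumed; `api.php`, the 1000ms timeout, and the helper names are placeholders, not my actual setup):

```javascript
// Option 1 sketch: POST with a timeout, falling back to a GET.
// jQuery assumed; 'api.php' and 'handleResult' are placeholder names.
function handleResult(result) {
    console.log(result); // do something with the returned JSON object
}

function postWithFallback(payload) {
    $.ajax({
        url: 'api.php',
        type: 'POST',
        data: payload,
        dataType: 'json',
        timeout: 1000 // give up on the return trip after 1s
    }).done(handleResult).fail(function (xhr, textStatus) {
        if (textStatus === 'timeout') {
            // The POST itself likely went through; re-fetch the result via GET.
            $.get('api.php', { retry: 1 }, handleResult, 'json');
        }
    });
}
```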

Either solution, as far as I can see, depends on WHY the original call is failing. If something is wrong with the AJAX call itself, then a new GET is likely to go through and solve the issue. But if it is something with the server itself, then logically the GET would just time out as well, and I'm dead in the water.

Since I have no experience at all with websockets yet, I was hoping for some feedback on the best action to take next. Thanks for the input.

If it would help, I can very likely reduce the returned payload to 1/15 of its size with some sneaky coding. Would that make an impact?

  • You can [use websockets with PHP](http://stackoverflow.com/q/12203443/1048572) as well, they're not restricted to node.js. – Bergi Aug 01 '14 at 14:15
  • I wouldn't do the GET thing... what happens if you GET before the server has had a chance to process the POST? – Gio Aug 01 '14 at 14:16
  • If the network is fine (and the browser not buggy), there must be a problem with your server. Investigate. – Bergi Aug 01 '14 at 14:18

2 Answers


WebSockets are a great option! Working with SocketIO is actually pretty simple and has a shallow learning curve (a minimal server sketch follows the list below).

  • Because the connection stays open, each message skips the DNS lookup and connection setup, giving you lower latency. This means much lower overhead for each request you make.
  • If you ever foresee pushing data to your users, WebSockets are the de facto way to do it. Ajax polling is going out of style.
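
To give a sense of scale, a minimal SocketIO server looks something like this (a sketch only; the `update`/`result` event names and the port are made up for illustration, not your actual back end):

```javascript
// Minimal Socket.IO server sketch (Node.js).
// The 'update'/'result' event names and port 3000 are illustrative.
var http = require('http');
var server = http.createServer();
var io = require('socket.io')(server);

io.on('connection', function (socket) {
    // Each message here replaces one of your AJAX POSTs.
    socket.on('update', function (data) {
        // ...process data, then reply over the same open connection...
        socket.emit('result', { ok: true });
    });
});

server.listen(3000);
```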

That said, you would have to port your back-end logic to JavaScript, change your deployment strategy to a server that supports Node apps, and deal with learning a new environment, which carries its own overhead.

Before you explore Node, consider the drawbacks above. I think it is a great bit of technology, but I would also look into the following approaches, especially if you are pressed for time.

  • The 1/15 size reduction is totally worth it. Actually, it is worth it in both cases.
  • Can you do any sort of batching with your POST requests from the client side? (A rough sketch follows this list.) If subsequent requests rely on the results of the previous POST request, you cannot do this; in that case, I strongly suggest using WebSockets.
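
If batching does turn out to be possible, the client side could be as simple as buffering payloads and flushing them on an interval. A rough sketch, assuming jQuery and a hypothetical `api.php` endpoint (the 200ms window is an illustrative choice):

```javascript
// Client-side batching sketch: buffer payloads, flush them as one POST.
// 'api.php' and the 200ms flush window are illustrative.
var queue = [];

function enqueue(payload) {
    queue.push(payload);
}

setInterval(function () {
    if (queue.length === 0) return;
    var batch = queue;
    queue = [];
    $.post('api.php', { batch: JSON.stringify(batch) }, function (results) {
        // The server would return one result per batched item.
    }, 'json');
}, 200);
```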

All in all, there are always tradeoffs. If you aren't pressed for time, give Node and SocketIO a whirl; they are becoming very prevalent web technologies and are worth learning.

adu
  • Thanks. Batching my POSTs is not possible in this scenario, unfortunately. I will do some tests to judge how much the reduced payload affects things, and look into websockets more. Is there any prereq a server needs to have to run node? Or is it something I can just apt-get? – user3404465 Aug 01 '14 at 14:37
  • Yeah, you can if you are on Linux! [Nodejs w/ Package Manager](https://github.com/joyent/node/wiki/Installing-Node.js-via-package-manager#ubuntu-mint-elementary-os) – adu Aug 01 '14 at 18:03

Cut the entire thing down, learn node.js so I can use websockets to potentially rectify this issue.

There is no point doing things with the wrong tools. If you need real-time communication, use a server that supports it out of the box, like node.js (probably the simplest to get into coming from PHP).

Since I have no experience yet at all with websockets

Get some framework on top of raw websockets, like primus or socket.io, and good luck ;)
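
For a sense of what that looks like, here is a minimal socket.io client sketch (the URL and event names are placeholders, chosen for illustration):

```javascript
// Minimal Socket.IO client sketch; URL and event names are illustrative.
var socket = io('http://localhost:3000');

socket.on('connect', function () {
    // Send what used to be a POST body as a message.
    socket.emit('update', { some: 'payload' });
});

socket.on('result', function (data) {
    console.log('got reply:', data);
});
```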

c69