
I have an app that uses native WebSockets in the browser and the node `ws` library on the server. I have been testing it on devices that do not have great Wi-Fi connections (two to five bars on a five-bar signal-strength indicator).

It seems that this really impacts performance. When a device is down to around two bars, messages appear not to be transmitted to or received from the server. I can't easily check whether the connection has been dropped, but it doesn't seem right that a weak signal alone would stop messages from getting through. Since WebSockets run over TCP, don't they guarantee delivery of messages, and wouldn't the connection persist as long as there is any internet connectivity?

Startec
  • Only if the TCP connection itself lasts. Even if there technically is a connection, it might be so bad the TCP driver just gives up. Or it keeps trying to transmit the same message from two hours ago. – John Dvorak Jan 27 '16 at 12:47
  • So if I send a basic message, for instance I press a button that sends "hello", then I press a button that sends "hi", will "hello" always be sent before "hi"? (Is there some sort of queue?) – Startec Jan 27 '16 at 12:53
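On the ordering question in the comments: within a single live TCP connection, bytes do arrive in the order they were sent, so "hello" reaches the server before "hi". What TCP does not do is preserve messages across a dropped and reopened connection, so apps often keep their own client-side queue and flush it in order after reconnecting. A sketch of that idea, assuming only an object with `send()` and a numeric `readyState` (the `Outbox` name is made up for illustration):

```javascript
// Illustrative client-side outbox that preserves send order.
// Messages are queued while the socket is not open and flushed
// first-in-first-out once flush() is called after (re)connecting.
class Outbox {
  constructor(socket) {
    this.socket = socket;  // anything with send() and readyState
    this.queue = [];
  }
  send(msg) {
    if (this.socket.readyState === 1 /* OPEN */) {
      this.socket.send(msg);
    } else {
      this.queue.push(msg);  // hold until the connection is back
    }
  }
  flush() {
    while (this.queue.length && this.socket.readyState === 1) {
      this.socket.send(this.queue.shift());  // FIFO: "hello" before "hi"
    }
  }
}
```

In the browser you would call `flush()` from the WebSocket's `onopen` handler; note that anything queued is still lost if the page closes before reconnecting.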

0 Answers