I have an app that uses native WebSockets in the browser and the ws library in Node.js on the server. I have been testing it on devices with weak Wi-Fi connections (on a five-bar signal-strength indicator they range from two to five bars).
This seems to really impact performance. When a device is down to around two bars, messages appear to neither reach the server nor come back from it. I can't easily check whether the connection has actually been dropped, but it doesn't seem right that a weak signal alone would stop messages from getting through. Since WebSockets run over TCP, don't they guarantee delivery of messages, and shouldn't the connection persist as long as there is any internet connectivity at all?
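For what it's worth, the only way I can think of to check whether connections are actually dropping is a ping/pong heartbeat on the server. Something like the sketch below using the ws API is what I'm considering adding (the 30-second interval, the port, and the `isAlive` flag are just placeholders, not code that's in the app yet):

```js
const { WebSocketServer } = require('ws');

// Placeholder port; my real server is set up elsewhere.
const wss = new WebSocketServer({ port: 8080 });

// Mark a connection alive whenever it answers a ping.
function heartbeat() {
  this.isAlive = true;
}

wss.on('connection', (ws) => {
  ws.isAlive = true;
  ws.on('pong', heartbeat);
});

// Every 30 seconds, terminate connections that never answered
// the previous ping, then ping the rest again.
const interval = setInterval(() => {
  wss.clients.forEach((ws) => {
    if (ws.isAlive === false) return ws.terminate();
    ws.isAlive = false;
    ws.ping();
  });
}, 30000);

wss.on('close', () => clearInterval(interval));
```

Even with something like that in place, though, I'd like to understand why a weak (but present) signal would cause messages to go missing in the first place.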