I am building a file synchronization program (not unlike Dropbox) using Node.js on both ends. It needs to handle potentially thousands of clients requesting data at the same time.
Here is my current system:
- Server pushes notifications to client over a websocket (file has been updated)
- Client queues downloads and makes an HTTP request when idle
I will be serving data in compressed chunks of, say, 50 MB each, so the HTTP request overhead (headers) is negligible.
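For reference, this is roughly what the client side looks like right now (a simplified sketch; the hostname, ports, paths, and message format are placeholders):

```js
// client.js — simplified sketch of the current client (endpoints/paths are placeholders)
const WebSocket = require('ws');
const http = require('http');
const fs = require('fs');

const queue = [];          // chunk IDs waiting to be downloaded
let downloading = false;   // only one HTTP download at a time

// Dedicated notification socket: the server pushes "file updated" events here
const notifications = new WebSocket('ws://sync.example.com:8081/notifications');
notifications.on('message', (msg) => {
  const { chunkId } = JSON.parse(msg);
  queue.push(chunkId);
  drainQueue();
});

// When idle, pull the next chunk over plain HTTP
function drainQueue() {
  if (downloading || queue.length === 0) return;
  downloading = true;
  const chunkId = queue.shift();

  http.get(`http://sync.example.com:8080/chunks/${chunkId}`, (res) => {
    res.pipe(fs.createWriteStream(`./cache/${chunkId}.gz`))
       .on('finish', () => { downloading = false; drainQueue(); });
  });
}
```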
If I were to use websockets for requests and push notifications, would there be:
- Noticeable overall speed improvements? (reduced latency, no per-request authentication handshake, etc.)
- Additional overhead on the server to keep connections open?
- Issues with pushing binary data?
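For context, this is roughly what I picture a websocket-based chunk request/response looking like (just a sketch; the port, paths, and message format are made up):

```js
// ws-serve.js — sketch of serving chunks over the websocket instead of HTTP
// (port, paths, and message format are hypothetical)
const WebSocket = require('ws');
const fs = require('fs');

const wss = new WebSocket.Server({ port: 8082 });

wss.on('connection', (socket) => {
  socket.on('message', (msg) => {
    const { chunkId } = JSON.parse(msg);
    // ws sends Buffers as binary frames, so no base64 encoding should be needed
    fs.readFile(`./chunks/${chunkId}.gz`, (err, data) => {
      if (err) return socket.send(JSON.stringify({ error: 'not found', chunkId }));
      socket.send(data, { binary: true });
    });
  });
});
```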
I think I need to send notifications over a dedicated websocket, because I don't want them queued on the server behind a download that is already in progress (lots of overhead).
Note: These websockets will be open long-term, as long as the client's system is on.
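Since the sockets stay open indefinitely, I'm planning to use the standard ws ping/pong heartbeat to detect dead connections, something like this (the interval is arbitrary):

```js
// heartbeat.js — server-side keepalive for long-lived connections (interval is arbitrary)
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8081 });

wss.on('connection', (socket) => {
  socket.isAlive = true;
  socket.on('pong', () => { socket.isAlive = true; });
});

// Every 30s, drop connections that never answered the last ping
setInterval(() => {
  for (const socket of wss.clients) {
    if (!socket.isAlive) { socket.terminate(); continue; }
    socket.isAlive = false;
    socket.ping();
  }
}, 30000);
```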
EDIT: I will be running the websockets on separate HTTP servers on different ports in order to spread them across CPU cores. I could potentially have thousands (if not hundreds of thousands) of concurrent websockets open...
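This is roughly what I have in mind for spreading the websocket servers across cores (a sketch only; the port scheme is arbitrary, and it assumes a Node version with cluster.isPrimary):

```js
// cluster.js — one websocket server per CPU core, each on its own port (port scheme is arbitrary)
const cluster = require('cluster');
const os = require('os');

if (cluster.isPrimary) {
  // Fork one worker per core; each worker gets its own port via an env var
  os.cpus().forEach((_, i) => {
    cluster.fork({ WS_PORT: 9000 + i });
  });
} else {
  const WebSocket = require('ws');
  const wss = new WebSocket.Server({ port: Number(process.env.WS_PORT) });

  wss.on('connection', (socket) => {
    // ...push "file updated" notifications to this client...
  });
}
```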