In the development environment I have a single Node.js process with Socket.IO listening on a given port. Once a client connects, state needs to be maintained. The Node.js script does some work over files and sends the processing status through the socket. It's like a batch job that may never end.
If the client closes the browser and then opens it again (in the development environment), the web page simply reconnects to the socket and grabs the current state of the process, which keeps handling the file processing in the background for that specific user.
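To make the reconnect behavior concrete, here is a minimal sketch of what the dev setup does; `emitToClients`, `reportProgress`, and the state shape are placeholder names standing in for the real Socket.IO wiring (`io.emit`), not my actual code:

```javascript
// Minimal sketch of the in-memory batch state; emitToClients stands in
// for io.emit from Socket.IO (hypothetical names, not the real code).
const state = { currentFile: null, processed: 0 };
const listeners = [];

function emitToClients(event, payload) {
  for (const fn of listeners) fn(event, payload);
}

// The background file-processing loop calls this after each file.
function reportProgress(file) {
  state.currentFile = file;
  state.processed += 1;
  emitToClients('progress', state);
}

// On (re)connection the client just asks for the current snapshot.
function currentSnapshot() {
  return { ...state };
}
```

The point is that the snapshot lives in the memory of the one process doing the work, which is exactly why the client has to come back to that same process.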
In order to have a single process per user and ensure that users always reconnect, on the same port, to the same process they started, I need to manage those processes as services on the server, one per user, and reserve a port for each user.
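A naive sketch of the port bookkeeping I have in mind follows; `BASE_PORT` and `MAX_USERS` are illustrative values, not real constraints:

```javascript
// Per-user port reservation, assuming a fixed port range is set aside.
// BASE_PORT and MAX_USERS are illustrative, not real constraints.
const BASE_PORT = 9000;
const MAX_USERS = 100;

const userToPort = new Map(); // userId -> reserved port
const freePorts = Array.from({ length: MAX_USERS }, (_, i) => BASE_PORT + i);

// Returns the same port for a returning user, or books a fresh one.
function portFor(userId) {
  if (userToPort.has(userId)) return userToPort.get(userId);
  const port = freePorts.shift();
  if (port === undefined) throw new Error('no free ports left');
  userToPort.set(userId, port);
  return port;
}

// Frees the port once the user's batch finally ends.
function releasePort(userId) {
  const port = userToPort.get(userId);
  if (port !== undefined) {
    userToPort.delete(userId);
    freePorts.push(port);
  }
}
```

This map itself is state, of course, so it would have to live in whatever single entry point routes the connections.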
How would I build this as a single Node.js entry point that forks the processes and routes the socket connections accordingly, using perhaps Nginx, PM2, and/or Node.js clusters? In other words, what is the best production architecture for a scenario like this?
About the state problem:
The state I'm holding is not just variables that could be stored in a database. I have continuous file read streams that are processed one after another, in the order the user configured through the web page connected via sockets to that process. The process also connects to another socket server-to-server, and that connection must be kept alive.