I am writing a system with multiple user types, each granted different levels of access to create, remove, and update resources in the database. To my thinking, these resources should be displayed in real time so that deleted resources disappear and newly created ones appear. The more timely the data, the better the user experience seems to be.
Using socket.io, I am currently working around this by subscribing each user to the data sources their privileges require, using namespaces to differentiate between the abilities of user types. I simply set up a JavaScript interval for each kind of data subscription, which hammers the database every second for a fresh set of data with no regard for whether the data has actually changed, and then send the result back to the client application.
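To make the current setup concrete, here is a minimal sketch of the polling approach described above. `nsp` stands in for a socket.io namespace and `fetchResources` for whatever query layer actually hits the database; both names are illustrative.

```javascript
// Sketch of the current approach: poll the database on a fixed interval
// and push the full result set to every subscriber in the namespace,
// whether or not anything changed. `nsp` only needs an emit() method,
// so a real socket.io namespace drops straight in.
function startPolling(nsp, fetchResources, intervalMs = 1000) {
  return setInterval(async () => {
    const rows = await fetchResources(); // full refetch every tick
    nsp.emit('resources', rows);         // broadcast to all subscribers
  }, intervalMs);
}
```

The returned timer handle can be passed to `clearInterval` when the last subscriber disconnects, which is the first easy saving before any diffing is introduced.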
This is fine for testing, but I have been struggling to find a way to prevent this waste of bandwidth and reduce the database load. One idea I have had is to populate a global object with all the organisation ids as keys; each value would be an object keyed by subscription name, holding the up-to-date data set for that subscription.
In this way I could add each new socket to a room when it joins, and refactor the database queries to determine the changes to each subscription set, which I could then relay to the relevant organisation rooms as a series of socket.io broadcasts.
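A minimal sketch of that cache-and-diff idea might look like the following. The cache shape, the `id` field on rows, and the room-naming scheme are all assumptions; the core is `diffById`, which compares the cached set with a fresh query result so only deltas need broadcasting.

```javascript
// Hypothetical in-memory cache: { [orgId]: { [subscription]: rowsArray } }
const cache = {};

// Compute what changed between the cached rows and a fresh fetch,
// matching rows by an assumed unique `id` property.
function diffById(prevRows, nextRows) {
  const prev = new Map(prevRows.map(r => [r.id, r]));
  const next = new Map(nextRows.map(r => [r.id, r]));
  const created = [], updated = [], removed = [];
  for (const [id, row] of next) {
    if (!prev.has(id)) created.push(row);
    else if (JSON.stringify(prev.get(id)) !== JSON.stringify(row)) updated.push(row);
  }
  for (const id of prev.keys()) {
    if (!next.has(id)) removed.push(id); // deleted server-side
  }
  return { created, updated, removed };
}

// On each refresh, broadcast only the delta to the organisation's room,
// e.g. (socket.io wiring shown as comments, names illustrative):
//   const delta = diffById(cache[orgId]?.[sub] ?? [], freshRows);
//   (cache[orgId] ??= {})[sub] = freshRows;
//   io.to(`${orgId}:${sub}`).emit(`${sub}:delta`, delta);
```

Clients then apply `created`/`updated`/`removed` to their local copy instead of replacing the whole set, which is where the bandwidth saving comes from.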
My main reservations about this implementation are the potentially high server-side memory requirements, which would reduce scalability, and the possibility of client data sets falling out of sync if they miss any of these broadcasts. Is this an acceptable solution for a production server, or is there a simpler/better approach I am not seeing to keep large data sets real-time from the client's perspective?
Thanks.