
Here is my code using socket.io for WebSockets, with a Redis pub/sub backend.

var io = io.listen(server),
    buffer = [];

var redis = require("redis");

var subscribe = redis.createClient();  // <--- overhead: opens a new connection

io.on('connection', function(client) {

    console.log(client.request.headers.cookie);

    subscribe.get("..", function (err, replies) {

    });

    subscribe.on("message",function(channel,message) {

        var msg = { message: [client.sessionId, message] };
        buffer.push(msg);
        if (buffer.length > 15) buffer.shift();
        client.send(msg);
    });

    client.on('message', function(message){
    });

    client.on('disconnect', function(){
        subscribe.quit();
    });
});

Every new socket.io connection creates a new Redis connection. If someone opens a browser with 100 tabs, the Redis client will open 100 connections. That doesn't look good.

Is it possible to reuse a Redis connection if the cookies are the same? That way, someone opening many browser tabs would still be treated as a single connection.

yojimbo87
user717166
    I just wrote a [scalable socket.io sample](https://github.com/trantorLiu/Scalable-Socket.IO-Sample) you may want to take a look. – Trantor Liu Sep 27 '12 at 14:11
  • Here is one good [link](https://github.com/Automattic/socket.io/wiki/configuring-socket.io) – zangw Feb 03 '15 at 01:36

5 Answers


Actually, you are only creating a new Redis client for every connection because you are instantiating the client inside the "connection" event handler. What I prefer to do when building a chat system is to create three Redis clients: one for publishing, one for subscribing, and one for storing values in Redis.

for example:

var socketio = require("socket.io")
var redis = require("redis")

// redis clients
var store = redis.createClient()
var pub = redis.createClient()
var sub = redis.createClient()

// ... application paths go here

var socket = socketio.listen(app)

sub.subscribe("chat")

socket.on("connection", function(client){
  client.send("welcome!")

  client.on("message", function(text){
    store.incr("messageNextId", function(e, id){
      store.hmset("messages:" + id, { uid: client.sessionId, text: text }, function(e, r){
        pub.publish("chat", "messages:" + id)
      })
    })
  })

  client.on("disconnect", function(){
    client.broadcast(client.sessionId + " disconnected")
  })

  sub.on("message", function(pattern, key){
    store.hgetall(key, function(e, obj){
      client.send(obj.uid + ": " + obj.text)
    })
  })

})
sintaxi
    just to be clear, this is only creating three redis clients in total regardless of how many users are connected. Adding another node process obviously results in more redis clients. – sintaxi Apr 21 '11 at 20:53
  • for sub.on('message'), why do you do client.send instead of client.broadcast? Thanks – noli Apr 27 '11 at 09:37
  • @Noli good question. You will notice that because we are subscribing to a redis channel within the socket "connection" closure, this is all that is needed to send everyone a message, because the "message" event on the sub object will get triggered for every client who is connected. If we used client.broadcast() every person would see the message times the number of people in the room. – sintaxi Apr 29 '11 at 21:11
    @Noli just to clarify, we could use broadcast but we would have to bind the listener outside of the "connection" closure so that the event only gets fired once. We also would have to change it to socket.broadcast() because the client object is not available to us. This may be better depending on the situation. Good catch :) – sintaxi Apr 29 '11 at 21:22
  • Why need to closure the `sub.on` instead: `sub.subscribe("chat"); sub.on("message", function(pattern, key) { store.hgetall(key, function(e, obj) { io.sockets.send(obj.uid + ": " + obj.text) }) }); io.sockets.on("connection", function(client) { ... }`? – ingeniarius Jul 19 '11 at 06:50
    I know there's been a significant gap of time here, but I believe that the answer to this question is somewhat dangerous. Binding an event listener to the message event of sub will happen every time a new client joins the chat. This event listener will remain around after the client disconnects. This will result in a buildup of stale event listeners handling messages for clients who have already gone. – tabdulla Apr 27 '12 at 06:07
    @tabdulla I know this response is even later, but I've figured out how to resolve the dangling listener issue. http://stackoverflow.com/questions/11617811/how-to-remove-redis-on-message-listeners/11617812#11617812 – hrdwdmrbl Jul 23 '12 at 17:54
  • @jackquack a better idea would be to have redis subscribe to the messages *outside* of the socketio logic. You could then have a global array of current socket connections which the subscribe logic can work with if necessary. – Mahn Sep 02 '14 at 22:19

Redis is optimized for a high number of concurrent connections. There is also discussion about multiple database connections and a connection pool implementation in the node_redis module.
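The pooling idea can be sketched generically (a hypothetical helper, not node_redis's actual API): pre-create a fixed number of clients and hand them out round-robin, so the Redis connection count stays constant no matter how many sockets connect.

```javascript
// Sketch of a minimal round-robin connection pool (hypothetical; in a real
// app the factory would be something like: function () { return redis.createClient(); }).
function createPool(factory, size) {
  var clients = [];
  for (var i = 0; i < size; i++) clients.push(factory()); // connect up front

  var next = 0;
  return {
    acquire: function () {
      var client = clients[next];
      next = (next + 1) % size; // round-robin: connections are shared, never grown
      return client;
    },
    size: function () { return clients.length; }
  };
}
```

Each socket.io connection would then call `pool.acquire()` instead of `redis.createClient()`, capping the number of open Redis connections at the pool size.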

Is it possible to reuse redis connection if the cookies are same? So if someone open many browser tabs also treat as open 1 connection.

You can, for example, use HTML5 storage on the client side so that only one tab keeps an active connection, while the other tabs handle communication/messages through storage events. It's related to this question.
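A minimal sketch of that single-active-tab idea (hypothetical names; the `store` parameter stands in for window.localStorage so the logic can also run outside a browser): one tab holds a heartbeat lease and owns the socket connection, while the other tabs see the lease and defer, receiving relayed messages via "storage" events.

```javascript
// Sketch: decide whether this tab should own the single socket connection.
// Assumptions: same-origin tabs, a 5-second heartbeat lease, and that the
// leader tab calls this periodically to renew.
function claimConnection(store, tabId, now) {
  var raw = store.getItem("socketLeader");
  var lease = raw ? JSON.parse(raw) : null;
  var expired = !lease || now - lease.ts > 5000; // leader heartbeat timed out

  if (expired || lease.tabId === tabId) {
    // Take (or renew) the lease; this tab opens the socket.io connection.
    store.setItem("socketLeader", JSON.stringify({ tabId: tabId, ts: now }));
    return true;
  }
  // Another tab holds the connection; listen for "storage" events instead,
  // which fire in every other same-origin tab when the leader relays a
  // message via store.setItem("lastMessage", ...).
  return false;
}
```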

yojimbo87
  • I thought the Redis client could be stored against the session cookie, so that the next call with the same cookie could reuse the Redis connection and publish messages, e.g. var sessioncookie = redis.createClient(); So it is better to do it on the server side. – user717166 Apr 21 '11 at 13:54
1

I had this exact problem, with the extra requirement that clients must be able to subscribe to private channels, and messages published to those channels should not be sent to all listeners. I attempted to solve this problem by writing a miniature plugin. The plugin:

  • Uses only 2 Redis connections: one for pub, one for sub
  • Subscribes to "message" only once in total (not once per connection)
  • Allows clients to subscribe to their own private channels, without messages being sent to all other listening clients

Especially useful if you're prototyping somewhere with a Redis connection limit (such as redis-to-go). SO link: https://stackoverflow.com/a/16770510/685404
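The multiplexing idea behind such a plugin can be sketched like this (a hypothetical helper, not the linked plugin's actual API): a single subscriber connection, a single "message" listener, and a map from channel name to the clients interested in it.

```javascript
// Sketch: multiplex many private channels over one Redis subscriber.
// `sub` is anything exposing on/subscribe/unsubscribe, so a stub works in
// tests; in the real app it would be redis.createClient().
function createChannelMux(sub) {
  var channels = {}; // channel name -> array of interested clients

  // Exactly one "message" listener in total, regardless of client count,
  // avoiding the stale-listener buildup described in the comments above.
  sub.on("message", function (channel, message) {
    (channels[channel] || []).forEach(function (client) {
      client.send(message); // fan out only to this channel's clients
    });
  });

  return {
    join: function (client, channel) {
      if (!channels[channel]) {
        channels[channel] = [];
        sub.subscribe(channel); // first interested client triggers SUBSCRIBE
      }
      channels[channel].push(client);
    },
    leave: function (client, channel) {
      var list = channels[channel] || [];
      var i = list.indexOf(client);
      if (i !== -1) list.splice(i, 1);
      if (list.length === 0) {
        delete channels[channel];
        sub.unsubscribe(channel); // no listeners left for this channel
      }
    }
  };
}
```

Clients on different private channels never see each other's messages, and disconnecting clients are simply removed from the map rather than leaving dangling Redis listeners.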

Josh Mc
  • The main problem is that Redis consumes memory for each client that subscribes to a channel. With 50k concurrent users, it's not a good idea to implement this. – user717166 Jun 07 '13 at 05:25
  • Very good point, in my case it was for a game, and some client data needed to be sent to groups, and other to only single players. But I guess it would not be a go-ahead with 50k users ^^ – Josh Mc Jun 07 '13 at 08:06

You need to remove the listener when the client disconnects.

var io = io.listen(server),
    buffer = [];

var redis = require("redis");

var subscribe = redis.createClient();  

io.on('connection', function(client) {

    console.log(client.request.headers.cookie);

    subscribe.get("..", function (err, replies) {

    });

    var redis_handler = function(channel,message) {

        var msg = { message: [client.sessionId, message] };
        buffer.push(msg);
        if (buffer.length > 15) buffer.shift();
        client.send(msg);
    };

    subscribe.on("message", redis_handler);


    client.on('message', function(message){
    });

    client.on('disconnect', function(){
        subscribe.removeListener('message', redis_handler);
        //subscribe.quit();
    });
});

See Redis, Node.js, and Socket.io : Cross server authentication and node.js understanding

nakwa

Using Redis as a store has become much simpler since this question was asked and answered; it is built in now.

Note that if you are using Redis because you are using the new Node clustering capabilities (to utilize multiple CPUs), you have to create the server and attach the listeners inside each of the cluster forks (this is never actually explained anywhere in any of the documentation ;) ). The only good code example online that I have found is written in CoffeeScript, and I see a lot of people saying this sort of thing "just doesn't work"; it definitely doesn't if you do it wrong. Here's an example of doing it right (but it is in CoffeeScript).
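For reference, the built-in store configuration looked roughly like this under the socket.io 0.9 API (a sketch based on the wiki page linked in the question's comments; names may differ in other versions, so treat this as an assumption rather than a definitive snippet):

```javascript
// Sketch: socket.io 0.9-era RedisStore configuration (assumed API).
// `server` is an existing http server instance.
var io = require("socket.io").listen(server);
var RedisStore = require("socket.io/lib/stores/redis");
var redis = RedisStore.redis;

io.set("store", new RedisStore({
  redisPub: redis.createClient(),    // publishes events to other processes
  redisSub: redis.createClient(),    // receives events from other processes
  redisClient: redis.createClient()  // general key/value storage
}));
```

With this in place, broadcasts reach clients connected to any of the cluster's worker processes, since each worker shares state through the same three Redis connections.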

cmcculloh