
I'm starting to work on a new project that uses Redis as a pub/sub system to display results from a MySQL database. When there are updates, I want to publish them from MySQL to my web page. My question is: which option would be better?

Option 1: Should I just do all of this via Node.js and Socket.IO? That is, create a Node.js script that connects to Redis, subscribes to the channels I need to listen to, and queries the MySQL database for updates; if there are updates, it publishes the MySQL rows, and the HTML page connected to Node.js via Socket.IO receives the new data and processes it to display the results.

Option 2: Have a PHP script query MySQL and publish any updates to the channel with a Redis PHP client? I don't know exactly what else needs to be set up from here. Do I still need Node.js involved in this option?

Or am I just off base on how all this works? The bottom line is that I want to display results from a MySQL database to the user using Redis pub/sub.

John
  • What is your intention in adding additional layers to your architecture? I mean, are there any specific reasons for using nodejs, socket.io and redis in your app? – Sukumar Jul 31 '11 at 18:34
  • @Sukumar My project consists of lists that need to be updated in real time. After reading about Redis's pub/sub system, it seemed a better fit resource-wise and for real-time needs than using client-side scripting with AJAX to poll the db for updates. NodeJS can handle a lot of connections and can multitask, and Socket.io is an easy way to connect to the nodejs server. Like I said, I could be wrong; this is why I'm posting. – John Jul 31 '11 at 18:38

1 Answer


Option 3

When you update MySQL from PHP, you also publish those changes to node.js via Redis's PUBLISH command (i.e. publish from PHP whenever you mutate the database). In node.js I would receive those changes in real time thanks to Redis's SUBSCRIBE, and then just broadcast them via socket.io to the users who are interested. You could, for example, publish to a channel named mysql. Take the SQL statement INSERT INTO comments VALUES (1, "Hello World"), where 1 is something like a user id and "Hello World" is the comment. I probably would not publish the SQL statement itself to that channel, but JSON instead, which I can easily use from both JavaScript (JSON.stringify / JSON.parse) and PHP (json_encode / json_decode).
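
For example, the JSON message published to the mysql channel for that comment insert could look something like this (the field names are only an illustration, matching the post.php example further down):

{"name": "Alfred", "message": "Hello World"}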

Update

You don't run a cron job, because that would defeat the purpose of Redis's pub/sub. Say, for example, that I visit your website, which is a blog, and read an article at http://localhost/a.php. Below the article you provide a form which I can use to post a comment on that article:

a.php

<html>
<head>
    <title>Interesting blog post</title>
</head>
<body>
    <div id="article">This is interesting</div>

    <div id="comments">
        <div class="comment">
            <div class="from">Alfred Said at 22:34</div>
            <div class="message">Hello World</div>
        </div>
    </div>

    <form action="post.php" method="post">
        <label for="name">Your name</label><br />
        <input type="name" id="name" name="name" /><br />

        <label for="message">Your Message:</label><br />
        <textarea id="message" name="message"></textarea>

        <input type="submit" />
    </form>


    <script src='jquery.min.js'></script>
    <script src='http://localhost:8888/socket.io/socket.io.js'></script>
    <script type="text/javascript">
        $(document).ready(function () {
                var socket = io.connect('http://localhost:8888');

                socket.on('message', function (json) {
                    var obj = $.parseJSON(json);
                    alert('in here: ' + obj.name);
                });
        });
    </script>
</body>
</html>

I submit the form, whose action attribute points to post.php. But this is the important part! In post.php you retrieve the data I posted and insert it into MySQL using something like INSERT INTO comments VALUES (1, "Hello World"). When this mutation happens, you also need to inform the node.js process, which is continually listening on the channel mysql:

post.php:

<?php

$_POST  = filter_input_array(INPUT_POST, FILTER_SANITIZE_STRING);

require("./Predis.php");
$redis = new Predis\Client();
$obj = array(
    'name'      => $_POST['name'],
    'message'   => $_POST['message']
);
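
// NOTE: the actual INSERT INTO comments ... query against MySQL is omitted here;
// in a real post.php you would write the row into MySQL at this point,
// before publishing the change to Redis.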

$json = json_encode($obj);
$redis->publish("mysql", $json);

echo $json;

post.php requires predis.
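
(If you install Predis through Composer instead of including the single-file Predis.php, the setup would look roughly like this; the composer command and autoloader path are the usual Composer conventions, not part of the original answer.)

composer require predis/predis

and then in post.php:

require __DIR__ . '/vendor/autoload.php';
$redis = new Predis\Client();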

The node.js code, using node_redis, would look something like this:

var redis       = require('redis'),
    subscriber  = redis.createClient(),
    express     = require('express'),
    store       = new express.session.MemoryStore(),
    app         = express.createServer(
        express.bodyParser(),
        express.static(__dirname + '/public'),
        express.cookieParser(),
        express.session({ secret: 'htuayreve', store: store})),
    sio         = require('socket.io');
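// note: `subscriber` is a dedicated Redis connection; once a connection has issued
// SUBSCRIBE it can only be used for pub/sub commands, so any other Redis commands
// would need a separate client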

app.listen(8888, '127.0.0.1',  function () {
    var addr = app.address();
    console.log('app listening on http://' + addr.address + ':' + addr.port);
});

var io = sio.listen(app);

io.configure(function () {
    io.set('log level', 1); // reduce logging
});

io.sockets.on('connection', function (socket) {
    socket.join('mysql');   
    socket.on('disconnect', function () {
    });
});

subscriber.on('message', function (channel, json) {
    // this will always retrieve messages posted to mysql
    io.sockets.in('mysql').json.send(json);
});

subscriber.subscribe('mysql');

This sample depends on the following packages, which you can install via npm:

npm install socket.io
npm install redis
npm install express

Whenever the form is posted to post.php, I also publish those changes to Redis. This part is important! The node.js process always receives those changes thanks to Redis's pub/sub. Every time a PHP script mutates the database, you should publish the changes to Redis with PUBLISH.

P.S.: I hope this is clear. Maybe later, when I have some time available, I will update the answer with another little snippet...

Alfred
  • For the PHP script that updates MySQL, how do I run that continuously? Via a cron job? If so, doesn't that defeat the purpose of using nodejs? – John Jul 31 '11 at 22:57
  • Updated the answer and also included a little example! Hopefully it is clear now that you don't need cron jobs, just Redis's pub/sub. – Alfred Aug 01 '11 at 02:30
  • Ok, I think I understand. I believe the problem I was having wrapping my head around this was the whole client-side update process. I kept thinking I had to query the db every so many seconds to get updates, when it should only be updated when an action happens on the client side, i.e. adding, deleting or editing comments. I'm just so used to the client side doing all the work. – John Aug 01 '11 at 13:57
  • node.js is a process which is always running and available in the whole context (each request knows about the other requests), while each PHP process is separate (each request does not know anything about the other requests). That's what you should wrap your head around. – Alfred Aug 01 '11 at 14:09
  • What if I need to join multiple channels that might be different based on the user's login? – John Aug 01 '11 at 15:39
  • I believe it is no problem to join multiple channels, and socket.io will handle all of this for you. – Alfred Aug 02 '11 at 06:26
  • Can you explain what socket.join and socket.in do in your script above? And for the multiple dynamic channels, would it be better to pass the channels to subscribe to from the client to the nodejs server and do socket.join and socket.in on each channel I want to subscribe to? – John Aug 02 '11 at 20:55
  • Search for `room` at https://github.com/LearnBoost/socket.io to learn about join/leave. – Alfred Aug 02 '11 at 21:56
  • Ok, now I understand that part. One issue I'm still hung up on is why you are joining socket.io to the channel instead of just using redis, and why you are using socket.io to publish the data in JSON to the channel? I thought redis could do all of this in nodejs? Or is this a more optimal method? – John Aug 02 '11 at 22:48
  • Redis is a very fast, advanced key-value store which also has pub/sub. Redis uses plain sockets; this means it does not use HTTP. I like to use Redis's pub/sub to communicate with PHP in a very fast manner over sockets. To send changes in real time to all browsers, we need socket.io. I hope this makes it all clear. – Alfred Aug 03 '11 at 00:03
  • Also, the nice thing about redis is that there are already libraries available in almost any language. I could also just use plain sockets to communicate between PHP and node.js, but the nice thing about Redis is that I don't have to do that. Also, redis is probably faster because of its C implementation! – Alfred Aug 03 '11 at 00:15
  • In the subscriber.on function in the nodejs script you have above, you manually set the channel like so: io.sockets.in('mysql').json.send(json); But that callback is passed channel, json. Could I just use the channel variable instead, so it can pass the info to whatever channel I publish to in the post.php script? – John Aug 03 '11 at 09:43
  • @John let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/2093/discussion-between-alfred-and-john) – Alfred Aug 03 '11 at 12:03
  • You are freaking awesome, that's it! – Maziyar Jul 22 '13 at 12:33
  • @Alfred Could you tell me which versions of express and redis you used at the time of this answer (3 years ago)? I tested your solution now, but it does not work because of changes in express. Could you refine your answer and make it work now? – hamed Mar 30 '15 at 11:47
  • Could you please put here a link to the working source code? – tong Aug 10 '15 at 10:04
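
Following up on the room/channel questions in the comments above (socket.join and dynamic channels), here is a minimal sketch, reusing the io and subscriber variables from the node.js snippet in the answer and the same old-style node_redis/socket.io APIs, of how messages could be forwarded to whatever channel they were published on instead of the hard-coded mysql channel. The channel and room names are only an illustration, not part of the original answer:

io.sockets.on('connection', function (socket) {
    // let each browser tell the server which rooms it wants to join,
    // e.g. socket.emit('join', 'comments.article42'); on the client side
    socket.on('join', function (room) {
        socket.join(room);
    });
});

// subscribe to a pattern of channels instead of a single channel
subscriber.psubscribe('comments.*');

// pmessage fires with the pattern, the concrete channel and the payload;
// forward the payload only to the sockets that joined the matching room
subscriber.on('pmessage', function (pattern, channel, json) {
    io.sockets.in(channel).json.send(json);
});

On the PHP side, post.php would then publish to the concrete channel, for example $redis->publish("comments.article42", $json);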