
I am getting started with Server-Sent Events (SSE) since my web app requires receiving real-time updates from the server. It does not need to send anything to the server, so SSE was chosen over WebSockets.

After reading through some examples, I have the following code:

On my server, in ./src/routers/mainRouter.js I have:

router.get('/updates', (req, res) => {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    })

    // Listens for 'event' and sends an 'Event triggered!' message to the client when it's heard.
    eventEmitter.addListener('event', () => {
        console.log('Event triggered! Sending response.')
        res.write('data: Event triggered!\n\n')
    })

    req.on('close', () => {
        console.log('Connection to client closed.')
        res.end()
    })
})

module.exports = router

On my client, in ./app/index.js I have:

const source = new EventSource('/updates')

source.onmessage = (e) => {
    console.log(e)
}

There are 2 issues I am having:

  1. Once I open a connection from the client side and then close it (by closing the tab), the 'close' event fires twice, resulting in the code block within req.on('close') running twice. I am not sure why this happens. My server-side console output looks as follows:

    Event triggered! Sending response.
    Connection to client closed.
    Connection to client closed.
    
  2. More importantly, although res.end() is called, the router still keeps listening for events on that channel and tries to send responses down it, resulting in an ERR_STREAM_WRITE_AFTER_END error and the server crashing. So the final console output looks like:

    Event triggered! Sending response. // First event triggers.
    Connection to client closed. // 'close' event fires.
    Connection to client closed. // 'close' event fires a second time (not sure why).
    Event triggered! Sending response. // Router continues listening for 'event' and sends another response although res.end() was called earlier
    events.js:187
          throw er; // Unhandled 'error' event
          ^
    
    Error [ERR_STREAM_WRITE_AFTER_END]: write after end
    
philosopher
  • This is same problem, same solution, as this question asked a few days before: https://stackoverflow.com/q/59751406/841830 – Darren Cook Jan 19 '20 at 08:17
  • @DarrenCook if you read the answer within that link, you will see that the marked answer recommends using response.end(), which was exactly what I was doing and it was causing a problem. I searched and read a lot of topics on SO about SSE before I created this question. Also, the comment discussion within the answer on this page (if you read it) is quite insightful, to say the least. – philosopher Jan 20 '20 at 17:09
  • That was my answer, and my update (once more code was added to the question) explains that if you used `addListener` that you have to use `removeListener`. (I wasn't flagging your question as a duplicate; just linking to it for additional perspective for the next person whose search brings them to either that question or this one.) – Darren Cook Jan 20 '20 at 21:38

1 Answer


When the stream closes, you need to remove your event listener so you won't try to write to the stream again. That could be done like this:

router.get('/updates', (req, res) => {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });

    function listener(event) {
        console.log('Event triggered! Sending response.');
        res.write('data: Event triggered!\n\n');
    }

    // Listens for 'event' and sends an 'Event triggered!' message to the client when it's heard.
    eventEmitter.addListener('event', listener);

    req.on('close', () => {
        // remove listener so it won't try to write to this stream any more
        eventEmitter.removeListener('event', listener);
        console.log('Connection to client closed.');
        res.end();
    });
});

module.exports = router;

FYI, I don't think you need the res.end() when you've already received the close event. You would use res.end() if you were unilaterally trying to close the connection from the server, but if it's already closing, I don't think you need it, and none of the code examples I've seen use it this way.

I wonder if it's possible that your res.end() is also why you're getting two close events. Try removing it.

jfriend00
  • Thanks for the reply. This makes complete sense. After reading the documentation about EventListeners, I realize that the above code will create as many instances of event listeners as there are clients. This might result in a memory leak if my app scales. Is there a way to restructure this code such that there is only one instance of an eventListener added to eventEmitter yet achieving the same result? The relevant documentation talking about memory leaks: https://nodejs.org/api/events.html#events_emitter_setmaxlisteners_n – philosopher Jan 18 '20 at 08:12
  • @philosopher - This is the correct model for your listener. eventEmitters are built just for this purpose when there are lots of possible listeners for the same event. While the EventEmitter may "warn" about a memory leak if you have more than 10 listeners for the same event, it is NOT a leak as long as you remove the listeners that are no longer active (the code we just added). And, in fact, this is exactly how you should be using a listener on an EventEmitter object. That's what they are for. – jfriend00 Jan 18 '20 at 08:35
  • @philosopher - You could develop your own means of sharing the event (creating an array of callbacks to be cycled through on each event), but that's just duplicating the EventEmitter functionality that it already does quite well - no reason to do that. – jfriend00 Jan 18 '20 at 08:35
  • Thank you again for sharing your knowledge! Not totally relevant but to elaborate on the problem I am trying to solve: I have a method that I call in app.js that sometimes makes changes to the state of the DB. I am trying to push those DB changes to the client using my SSE route. Whenever the aforementioned method changes DB state, it emits an event that will be listened for in the SSE route and pushed to the client. Is this the best way to solve the problem at hand or is there another way to link a method called in app.js to the SSE route that eventually pushes real time data to the client? – philosopher Jan 18 '20 at 08:46
  • @philosopher - This seems like the right way to do it. It's a good use of an `eventEmitter` because your routes are not coupled to the actual data source at all, just listening for an event and you can change the data source implementation at any time without having to change the routes at all. – jfriend00 Jan 18 '20 at 09:16
  • @jfriend00 Can you please help me? I am using SSE to send data in streams, but my data is getting sent all at once at `res.end()`. I have even tried `res.flush()` after every `res.write()` but nothing is working. My question is here https://stackoverflow.com/questions/61412538/ndoejs-sse-res-write-not-sending-data-in-streams – Sudhanshu Gaur Apr 25 '20 at 19:22
  • @SudhanshuGaur - Sorry, but I'm not familiar with SSE streams. – jfriend00 Apr 25 '20 at 19:33
  • @jfriend00 Actually, what you are doing in your code is the same as SSE (Server-Sent Events), because you are setting the header `'Content-Type': 'text/event-stream'`, which means sending data in streams. I am doing the same in my Node.js code, but the problem in my case is that `res.write()` is not sending data and all the data is being sent at once when I call `res.end()`. Do you know why this is happening, or any solution? Thanks anyway :) – Sudhanshu Gaur Apr 25 '20 at 19:39