
I have a feature in my web app that allows users to upload and download files. I serve the app with Express, but the files are stored on a different server, so I proxy those requests to it. Here's the proxy code, using the request library:

var request = require('request');

module.exports = function(req, res) {
    // Tag the outgoing request with the private ID expected by the file server.
    req.headers['x-private-id'] = getId();

    var url = rewriteUrl(req.url);

    var newRequest = request(url, function(error) {
        if (error) console.log(error);
    });

    // Pipe the client request into the proxied request; when the proxied
    // response arrives, strip the private header and pipe it back to the client.
    req.pipe(newRequest).on('response', function(res) {
        delete res.headers['x-private-id'];
    }).pipe(res);
};

This works fine for all of my requests, including file downloads. However, I run into issues when 'streaming' the file; by streaming, I mean using fancybox to display the video in a video tag. The video displays fine the first few times.

But if I close fancybox and then reopen it enough times (5 specifically), it quits working after that; the video no longer shows up. The entire Express server seems to hang, unable to process any more requests. If I restart the server, everything is OK. To me it seems like the sockets from the proxy requests aren't being closed properly, but I can't figure out why. Is there something wrong with my proxy code?

GJK
  • Is fancybox still transferring the video in the background even after you close the box? Check the network requests in your browser's debugging panel. Also when you rewrite the URLs, are they still pointing to the same server host and port? – mscdex Sep 11 '14 at 00:20
  • At first glance, it didn't look like the requests were still open. Chrome said that they were finished, but I guess it's possible that they weren't. Also, no, the requests are being proxied to another server. – GJK Sep 11 '14 at 00:22
  • By "same server" I meant the same remote server, not your Express server. – mscdex Sep 11 '14 at 01:34
  • That's what I meant too. We have another Amazon instance running a Jetty server that takes care of our file management. I change both the server and port when rewriting the URL. – GJK Sep 11 '14 at 11:31
  • Ah ok, that might explain why the sockets in the pool aren't being reused much and you're hitting the `maxSockets` quicker. – mscdex Sep 11 '14 at 13:21

1 Answer


You need to either increase the pool.maxSockets value passed in the request() config (it defaults to Node's HTTP Agent maxSockets, which is 5), or opt out of connection pooling altogether with pool: false in the request() config.
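For reference, here's a minimal sketch of both options applied to the request() call from the question. The pool size and variable names are illustrative, not from the original answer, and `url` is assumed to come from the question's rewriteUrl(req.url) call:

var request = require('request');

// Option 1: give these proxied requests a larger dedicated pool.
var pooledRequest = request({
    url: url,
    pool: { maxSockets: 100 } // illustrative size; the default agent allows 5
}, function(error) {
    if (error) console.log(error);
});

// Option 2: opt out of connection pooling for this request entirely.
var unpooledRequest = request({
    url: url,
    pool: false
}, function(error) {
    if (error) console.log(error);
});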

mscdex
  • Is that going to fix the root cause, or just cover it up? It seems to me like those sockets should be closed and put back in the pool. – GJK Sep 11 '14 at 00:12
  • They're not closed right away, the sockets use http keepalive for any future requests to the same host/port. – mscdex Sep 11 '14 at 00:18
  • OK, that makes sense. Would it also make sense to decrease the keep-alive time? Also, should I increase the pool size or just not use the pool? (I'm thinking the former.) – GJK Sep 11 '14 at 00:21
  • It's up to you, but in node v0.12+, `maxSockets` in Http.Agent defaults to Infinity instead of 5. – mscdex Sep 11 '14 at 01:52
  • OK, I'll switch pooling off then. I'm also going to lower the keep-alive time a bit, as seen [here](http://stackoverflow.com/questions/12651466/how-to-set-the-http-keep-alive-timeout-in-a-nodejs-server). Thanks for your help. – GJK Sep 11 '14 at 11:31
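
For completeness, a sketch of what the proxy handler might look like with pooling switched off, as discussed in the comments above. This is an illustration of that choice, not code from the thread; getId() and rewriteUrl() are the question's own helpers:

var request = require('request');

module.exports = function(req, res) {
    req.headers['x-private-id'] = getId();

    var url = rewriteUrl(req.url);

    // pool: false opts this request out of the shared agent pool, so repeated
    // video requests can no longer exhaust the default 5-socket limit.
    var newRequest = request({ url: url, pool: false }, function(error) {
        if (error) console.log(error);
    });

    req.pipe(newRequest).on('response', function(proxiedRes) {
        delete proxiedRes.headers['x-private-id'];
    }).pipe(res);
};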