
Add ability to cancel default response from proxyRes event handlers. #737


Open · wants to merge 4 commits into master
Conversation

erupenkman

I'm using this to enable following redirects from my application logic; I've added a simple example of how I'm using it.

Before this change, it would ALWAYS try to pipe proxyRes into res, even if it was a 301 redirect, etc.

…ng the response themselves. Added an example of using this to enable following redirects.
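The redirect case described above boils down to a small check plus re-proxying. Here is a minimal sketch; the helper is plain JavaScript, while the wiring shown in comments only assumes (and does not confirm) the cancellation semantics this PR proposes:

```javascript
// A response is a followable redirect when it is a 3xx status with a
// Location header (Node lowercases incoming header names).
function isFollowableRedirect(statusCode, headers) {
  return statusCode >= 300 && statusCode < 400 &&
    typeof headers.location === 'string';
}

// Hypothetical wiring, assuming the proxyRes handler is allowed to
// cancel the default piping (illustrative only, not this PR's exact API):
//
// proxy.on('proxyRes', function (proxyRes, req, res) {
//   if (isFollowableRedirect(proxyRes.statusCode, proxyRes.headers)) {
//     // cancel the default pipe of proxyRes into res, then re-proxy:
//     proxy.web(req, res, { target: proxyRes.headers.location });
//   }
// });
```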
@jcrugzz
Contributor

jcrugzz commented Nov 15, 2014

@erupenkman hmm, this is something I'd consider. You mind adding a test? @indexzero you have any thoughts on this?

@indexzero
Member

@jcrugzz yes, tests are definitely a must. There are a lot of error cases in this scenario that could leak fds I think.

@erupenkman
Author

Hmm, I tried to expand my use case (following 301 redirects) and I'm not sure this is even the correct approach for my need now... I will add a test in another pull request later today, though.

@DesignByOnyx

I am implementing a very similar feature, but it adds a crucial ability: it allows the outside world to pipe onto the response stream and return the new stream. This would let users easily modify the response from the origin server if they so choose. It would look something like this:

    proxyReq.on('response', function(proxyRes) {
      var doneCalled = false,
          done = function (newStream) {
            doneCalled = true;
            newStream = newStream || proxyRes;
            for(var i=0; i < web_o.length; i++) {
              if(web_o[i](req, res, proxyRes, options)) { break; }
            }

            // Allow us to listen when the proxy has completed
            newStream.on('end', function () {
              server.emit('end', req, res, proxyRes);
            });

            newStream.pipe(res);
          };

      if(server) {
        server.emit('proxyRes', proxyRes, req, res, done);
        if (!doneCalled) done();
      }
    });
And from the outside world:

    server.on("proxyRes", function (proxyRes, req, res, done) {
      var newStream = proxyRes.pipe(/* custom stuff here */);
      done(newStream);
    });

@cmawhorter

I'm in need of this myself. I wish there were a way to just do something like the snippet below.

I didn't dive too deeply into the code, but it looks like by the time proxyRes is emitted, res is already sent/sending (headers at least)?

If proxyRes events fired synchronously, before sending on res started, it should be possible to do this without leakage, since the original proxyRes is torn down prior to creating the follow-up proxied request.

This would also address @DesignByOnyx's use case, since you could abort the original proxyRes and then just pipe your stream directly to the original http res.

    // tries one host for a route and falls back to a second if 404
    proxy.on('proxyRes', function (proxyRes, req, res) {
      if (proxyRes.statusCode === 404 && req.proxyTarget === 'new.example.com') {
        proxyRes.abort();
        proxy.web(req, res, { target: 'http://old.example.com' });
      }
    });

    var server = http.createServer(function (req, res) {
      req.proxyTarget = 'new.example.com';
      proxy.web(req, res, { target: 'http://' + req.proxyTarget });
    });

If I can't find a hackaround, I might take a crack at it, if that solution sounds acceptable in theory.

@DesignByOnyx

I built my version of this and am actively using it on a project. It works quite well, but there is one caveat - I'll see if I can explain. The actual "piping" (e.g. making the hose longer, if you will) happens synchronously, but the water flowing through the hose happens asynchronously. This is just normal "piping" kind of stuff, but the drawback is that as soon as the final res is piped, the response headers are written - even if the water hasn't reached that portion of the hose yet (is this hose analogy working?). You can see my changes here: https://github.com/DesignByOnyx/node-http-proxy/tree/proxy-response-pipe.

Usage

    server.on('proxyRes', function (proxyRes, req, res, callback) {
      // Calling "callback" is optional for backwards compatibility.
      // If you pipe onto proxyRes, you must send the new stream reference to callback
      var newStream = proxyRes.pipe( ... );
      callback(newStream);
    });

The above example works well without issue, assuming that your new pipe emits data events periodically and the length of the data doesn't change - the way streams should work. In my situation, however, I have a handful of responses that I need to modify, and the end result may be longer or shorter than the original. This requires that the "content-length" header gets updated as well.

The problem I ran into was that as soon as data started flowing through my custom section of the pipe, the response headers had already been written. The only way I could get around this was to override res.writeHead and call it when I emitted the "end" event. The only way to do that reliably was to buffer the entire response into memory (crimp the hose), make the changes I want, update the "content-length" header, and finally flush all the data down the pipe (release the crimp). I hope that helps. I have code I can share too - just let me know.

@maxcountryman

Is this still under consideration? It would be quite nice to be able to hook into proxyRes and manipulate the stream before sending on to the client...

@mogsie

mogsie commented Apr 27, 2018

This should be possible to implement using the selfHandleResponse: true option; you then take responsibility for fulfilling the response yourself.
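The selfHandleResponse approach can be sketched as follows. With that option set, http-proxy skips its default piping of proxyRes into res and the proxyRes listener must fulfil the response itself; the body rewrite and upstream target here are placeholder assumptions for illustration:

```javascript
// Handler to register for 'proxyRes' when the proxy is created with
// { selfHandleResponse: true }: buffer the body, rewrite it, and
// fulfil res ourselves with corrected headers.
function handleProxyRes(proxyRes, req, res) {
  var chunks = [];
  proxyRes.on('data', function (chunk) { chunks.push(chunk); });
  proxyRes.on('end', function () {
    var body = Buffer.concat(chunks).toString()
      .replace(/old\.example\.com/g, 'new.example.com'); // placeholder rewrite
    var headers = Object.assign({}, proxyRes.headers);
    delete headers['content-length']; // body length may have changed
    res.writeHead(proxyRes.statusCode, headers);
    res.end(body);
  });
}

// Wiring (requires the http-proxy package; target is a placeholder):
// var proxy = require('http-proxy').createProxyServer({
//   target: 'http://localhost:9000',
//   selfHandleResponse: true
// });
// proxy.on('proxyRes', handleProxyRes);
```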
