Exploiting the new requestStream property with Express Gateway

We’ve recently released Express Gateway 1.10, which, besides bug fixes, contains some interesting new features. Today, we’ll be focusing on the new requestStream property added to the plugin framework.

What’s the requestStream property about

When a request is processed by Express Gateway, the system enriches it with a context object called egContext. This object contains information that Express Gateway knows about your system, which you can leverage for better introspection. For example, every request carries the name of the apiEndpoint that triggered the Gateway, as well as the required scopes.

Starting with 1.10, the egContext object contains a new property called requestStream. This property is a stream that, if provided, will be forwarded to the selected serviceEndpoint as the request body, replacing the original payload of the canonical request.

What can we do with that?

Most of the time, tampering with the user payload is something we want to avoid. However, there are cases where we might want to do it in a controlled way.

In this article, we'll see how we can use the new requestStream property to parse a JSON or URL-encoded request body and restream it back so that it's available to subsequent policies for processing.

The problem

Usually the request body is simply streamed as a sequence of bytes from the source client to the destination endpoint. This minimizes the gateway's memory usage and improves performance.

However, let's say we want to save the request body for logging or introspection purposes. In that case, we need to store it entirely in memory before starting to operate on it.

Express already offers two body parser implementations that can do this for us; the code for a hypothetical plugin implementing such a policy could look like this:

const jsonParser = require('express').json();
const urlEncodedParser = require('express').urlencoded({ extended: true });

const policy = actionParams => (req, res, next) =>
  jsonParser(req, res, () => urlEncodedParser(req, res, next));

We can then install this policy in any of our pipelines and, in theory, the subsequent policies should have access to the req.body property.

    policies:
      - body-parser:
      - log:
          - condition:
              name: expression
              expression: "req.body.limit > 5"
            action:
              message: '"Current user is approaching limits"'
      - proxy:
          - action:
              serviceEndpoint: backend

In this case, we're using the parsed body to test a condition that, when it holds true, triggers the log policy.

Now, we can try to shoot a request to the Express Gateway and see what happens:

curl -X POST -H "Content-Type: application/json" -d '{"limit":4, "name":"Clark"}' http://localhost:8080
…
…
…
Request Timeout

The request doesn’t go through and we never receive a response back. Why’s that happening?

When the body-parser policy parses the body, it reads through the request stream and, once the stream ends, parses the accumulated content using the appropriate function.

This operation consumes the request stream: once its end has been reached, its content is gone and cannot be read again.

Now what happens is that we send the target serviceEndpoint a Content-Length header claiming that a certain number of bytes is about to arrive, but those bytes never do. The web server on the receiving side keeps waiting for a body that will never come, and the request eventually times out.

Fortunately for us, request streams can be piped, which means their content can be streamed to multiple destinations at the same time.

In our case, we'll leverage this feature to pipe the original stream into an in-memory copy that we'll then send out using the requestStream property.

const { PassThrough } = require("stream");
const jsonParser = require("express").json();
const urlEncodedParser = require("express").urlencoded({ extended: true });

const policy = actionParams => {
  return (req, res, next) => {
    req.egContext.requestStream = new PassThrough();
    req.pipe(req.egContext.requestStream);

    return jsonParser(req, res, () => urlEncodedParser(req, res, next));
  };
};

The flow here is:

  • The request comes into the gateway
  • The body parsers (json and urlencoded) gradually read the stream until it reaches its end
  • Because we piped the request stream into the requestStream property, every chunk that is read is also written into the PassThrough stream
  • When sending out the request, the gateway prefers the requestStream over the original req stream. In this case, its content is the same as the original request; we simply created a fresh copy of it.
  • The request goes out, and we receive a response back

Conclusions

In this short article we saw how the newly introduced requestStream property can be used to restream content that we consumed while parsing it for our own purposes.

However, this is not its only use. We'll be exploring another interesting use case where the requestStream property can be leveraged.
