Brotli and Static Compression

Content compression can be as simple as flipping a switch, but there's a lot to consider beyond that. We pretty well know what we need to compress, but what about configuring compression? Or static versus dynamic compression? What about Brotli?

By now, Brotli enjoys support in a good chunk of browsers in use. While it provides performance advantages in many situations, there are some ins and outs that can prove challenging. At its highest compression setting, Brotli provides superior compression ratios to gzip, but the compression speed at this setting is slow enough that the benefit is squandered when content is dynamically compressed. What you really want in cases such as these is static compression. If you're unaware of the differences between static and dynamic compression, here's a quick refresher:

  • Dynamic compression occurs on-the-fly. The user makes a request, the content is compressed (as the user waits) and the compressed content is served.
  • Static compression is when assets are compressed on the disk ahead of the user's request. When the user requests an asset, compression doesn't happen. The pre-compressed asset is simply served from the disk.

The big issue with dynamic compression is that the server can't reply to the pending content request until the compression is done. This is no big deal at default compression levels. If you're cranking up the compression level to shrink assets as much as possible, though, it can hold up the show while the server waits for the compressor to finish. Even if you realize significantly lower file sizes, the delay of dynamic compression may end up being a performance liability.

The answer to this problem is, predictably, static compression, a concept that's no stranger to tech bloggers. In this short article, you'll learn how to set up your site to statically compress files for optimal compression performance, and see real-world results of this powerful technique.

How do I statically compress content?

How you use static compression depends on which web server you use. As this blogger points out, Nginx's Brotli module has static compression capability built right in. If you use Express, the shrink-ray node module provides this benefit through its own caching mechanism.

With other servers like Apache, however, it may not be so simple. Neither Apache's unofficial mod_brotli module nor its official mod_deflate module for gzip provides static compression functionality. You don't necessarily need a server module to accomplish this goal, though: you can statically compress assets on disk beforehand, and then configure the server to serve those pre-compressed assets from the disk using mod_rewrite.

So what do you use to pre-compress assets? You could manually do it using a binary in bash, but automating that work with gulp is much more convenient. Let's say you want to pre-compress all HTML, CSS, JavaScript and SVG images in a project and spit them out into a different folder:

const gulp   = require("gulp");
const brotli = require("gulp-brotli");

const brotliCompress = () => {
    const src  = "src/**/*.{html,js,css,svg}",
          dest = "dist";

    return gulp.src(src)
        .pipe(brotli.compress({
            extension: "br",
            quality: 11
        }))
        .pipe(gulp.dest(dest));
};

exports.brotliCompress = brotliCompress;

The brotliCompress task is then invoked like this:

gulp brotliCompress

This will process all the assets matched by the file glob (specified in the `src` variable) and output Brotli-compressed versions to the destination directory (specified in the `dest` variable). styles.css will be compressed to styles.css.br, scripts.js to scripts.js.br, et cetera. Best of all, the `quality` setting of `11` yields the best possible compression ratio. You can also generate pre-compressed gzip assets to serve to browsers that don't support Brotli with gulp-gzip. Its syntax is largely similar to gulp-brotli's, and you can use a `level` setting of `9` to max out your gains from that compression method as well.

So what's the next piece? This is where a bit of Apache configuration knowledge comes in handy. This blogger's technique works magnificently:

# Specify Brotli-encoded assets
<Files *.js.br>
    AddType "text/javascript" .br
    AddEncoding br .br
</Files>
<Files *.css.br>
    AddType "text/css" .br
    AddEncoding br .br
</Files>
<Files *.svg.br>
    AddType "image/svg+xml" .br
    AddEncoding br .br
</Files>
<Files *.html.br>
    AddType "text/html" .br
    AddEncoding br .br
</Files>

You can also specify gzip-encoded versions for those browsers that can't understand Brotli encoding:

# Specify gzip-encoded assets
<Files *.js.gz>
    AddType "text/javascript" .gz
    AddEncoding gz .gz
</Files>
<Files *.css.gz>
    AddType "text/css" .gz
    AddEncoding gz .gz
</Files>
<Files *.svg.gz>
    AddType "image/svg+xml" .gz
    AddEncoding gz .gz
</Files>
<Files *.html.gz>
    AddType "text/html" .gz
    AddEncoding gz .gz
</Files>

From here, you need a couple mod_rewrite rules to detect what encodings are available in the browser's Accept-Encoding request header and then serve the appropriately encoded asset to the user:

# Turn on mod_rewrite
RewriteEngine On

# Serve pre-compressed Brotli assets
RewriteCond %{HTTP:Accept-Encoding} br
RewriteCond %{REQUEST_FILENAME}.br -f
RewriteRule ^(.*)$ $1.br [L]

# Serve pre-compressed gzip assets
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)$ $1.gz [L]

With these rules, the server will send pre-compressed Brotli content to browsers that advertise it in their Accept-Encoding request headers. Browsers without Brotli support will get the statically compressed gzip versions instead. That's it: you've just learned how to serve statically compressed assets, with a gzip fallback for browsers that don't support Brotli.
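The negotiation those rewrite rules perform can also be expressed in a few lines of JavaScript, which is handy if you ever serve these files from Node rather than Apache. `pickEncoding` is a hypothetical helper for illustration, not part of any library:

```javascript
// Sketch: the same negotiation the mod_rewrite rules perform.
// pickEncoding is a hypothetical helper, not part of any library.
function pickEncoding(acceptEncoding, available) {
    // Strip quality values like "br;q=0.9" down to the bare token.
    const accepted = acceptEncoding
        .split(",")
        .map(token => token.trim().split(";")[0]);

    if (accepted.includes("br") && available.includes(".br")) return ".br";
    if (accepted.includes("gzip") && available.includes(".gz")) return ".gz";
    return ""; // fall back to the uncompressed asset
}

console.log(pickEncoding("gzip, deflate, br", [".br", ".gz"])); // → ".br"
console.log(pickEncoding("gzip, deflate", [".br", ".gz"]));     // → ".gz"
```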

How does this affect performance?

As you might guess, taking the cost of on-the-fly compression out of the equation confers a performance benefit for the user. To test this out, I deployed this change to a client's static site and ran some tests. The site was roughly 900 KB in total, with a number of stylesheets and scripts (including a sizable CSS/JS framework), some SVG images and some decently sized HTML. Using sitespeed.io, I ran 50 iterations on each of four scenarios:

  1. Dynamic Brotli compression with a quality setting of 11.
  2. Static Brotli compression (with the same quality setting).
  3. Dynamic gzip compression with a level setting of 9.
  4. Static gzip compression (with the same level setting).

The effects on back end time were quite noticeable:

Comparison of back end times for various compression methods

Here we see that dynamic Brotli compression at the highest compression level is very slow (as has been noted in a few write-ups). When we pre-compress assets with Brotli, though, we get the full benefit of the smallest possible file sizes without the penalty dynamic compression incurs at the highest level. The differences between dynamic and static gzip compression are less dramatic, but notable. On average, static compression reduces back end time. If there's one thing we know, it's this: if we can reduce back end time, we reduce almost every metric that occurs after it. Reduce back end time where you can, and you'll improve overall responsiveness.

Caveats and Conclusion

Static compression is ideal, but it's not for every situation. What about dynamic content, for example? Any page generated by a back end language (e.g., PHP or C#) falls into this category, and it can't be statically compressed because it doesn't exist on disk ahead of the request. What you can do, however, is use a sensible dynamic compression configuration for that content. Brotli or gzip at their default compression levels should yield a performance benefit without adversely affecting your site's time to first byte.
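As a sketch of that kind of sensible baseline, here's what dynamic gzip compression for generated pages might look like with Apache's mod_deflate, left near zlib's default level rather than cranked to 9 (note that DeflateCompressionLevel belongs in the server or virtual host config, not .htaccess):

```apacheconf
# Sketch: dynamic gzip for generated pages via mod_deflate,
# kept at a moderate level instead of the maximum.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/javascript
    DeflateCompressionLevel 6
</IfModule>
```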

What about BREACH attacks? Compressed content is vulnerable to this exploit over HTTPS, but the key here is that BREACH is only a problem for content containing sensitive, personally identifying information. In that case, you might leave your HTML uncompressed but still compress other kinds of content that don't contain sensitive information. It's a workable compromise, and there are other mitigation methods out there.

Beyond these scenarios (and potentially the upkeep of compressed assets in your workflow), there's very little reason not to adopt static compression. Even if your web server of choice doesn't support it, it's still feasible to implement. Give your site a boost and try it out today!


Cover of Web Performance in Action

Jeremy Wagner is the author of Web Performance in Action, an upcoming title from Manning Publications. Use coupon code sswagner to save 42%.

Check him out on Twitter: @malchata