Most of the web really sucks if you have a slow connection

Dan Luu on the sorry state of web performance:

It’s not just nerds like me who care about web performance. In the U.S., AOL alone had over 2 million dialup users in 2015. Outside of the U.S., there are even more people with slow connections.

This other note is also interesting, and I think that Dan is talking about YouTube’s project “Feather” here:

When I was at Google, someone told me a story about a time that “they” completed a big optimization push only to find that measured page load times increased. When they dug into the data, they found that the reason load times had increased was that they got a lot more traffic from Africa after doing the optimizations. The team’s product went from being unusable for people with slow connections to usable, which caused so many users with slow connections to start using the product that load times actually increased.

Performance Under Pressure

Here's a neat transcript of a talk by Mat Marquis where he details how he made the Bocoup website lightning fast, particularly with snazzy font loading tricks and performance tools to help monitor those improvements over time.

My favorite part of the talk, though, is when Mat gets into why he wants to make websites:

I don't get excited about frameworks or languages—I get excited about potential; about playing my part in building a more inclusive web.

I care about making something that works well for someone that has only ever known the web by way of a five-year-old Android device, because that's what they have—someone who might feel like they're being left behind by the web a little more every day. I want to build something better for them.

This browser tweak saved 60% of requests to Facebook

Ben Maurer & Nate Schloss:

The browser's reload button exists to allow the user to get an updated version of the current page. In order to meet this goal, when you reload, browsers revalidate the page that you are currently on, even if that page hasn't expired yet. However, they also go a step further and revalidate all sub-resources on the page — things like images and JavaScript files.

So even if you've set proper expires headers on resources, hitting that reload button (which people must do a ton at Facebook) can still trigger server round trips just to revalidate assets.
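On the wire, each of those revalidations is a conditional request, and the round trip happens even when the answer is "nothing changed." Illustrative headers, not Facebook's actual traffic:

```http
GET /static/app.js HTTP/1.1
Host: example.com
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"
```

A 304 carries no body, but the request/response latency is still paid for every sub-resource on the page.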

They worked with Chrome:

After fixing this, Chrome went from having 63% of its requests being conditional to 24% of them being conditional.

And Firefox:

Firefox implemented a proposal from one of our engineers to add a new cache-control header for some resources in order to tell the browser that this resource should never be revalidated.

So if you're using URLs for assets that never change (if they change, they'll be at a new URL), in Chrome you'll benefit automatically, and in Firefox you should use their new header.
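The header in question is `cache-control: immutable`. For a fingerprinted asset URL, the response might look like this (the max-age value here is just an illustrative one year):

```http
Cache-Control: max-age=31536000, immutable
```

The `immutable` directive tells the browser it never needs to revalidate this response while it's fresh, even on reload.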

Understanding the Critical Rendering Path

Ire Aderinokun:

There are 6 stages to the CRP -

  1. Constructing the DOM Tree
  2. Constructing the CSSOM Tree
  3. Running JavaScript
  4. Creating the Render Tree
  5. Generating the Layout
  6. Painting

I imagine if you're really getting into performance work, you'll want a firm understanding of this. There are lots of ways to block/delay parts of this process. The job of a perf nerd is to understand when and why that's happening, evaluate if it's necessary or not, and tweak things to get to that painting step as soon as possible.
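Much of that tweaking comes down to keeping resources from blocking the path. Two common examples, as illustrative markup (not taken from Ire's article):

```html
<!-- A stylesheet only blocks rendering when its media query matches,
     so print styles don't hold up the first paint -->
<link rel="stylesheet" href="print.css" media="print">

<!-- A deferred script downloads in parallel and runs after the DOM
     is built, instead of pausing HTML parsing -->
<script src="app.js" defer></script>
```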

I'm curious whether this is generic enough that every rendering engine works exactly the same way, or whether there are significant differences.

#152: Font Loading with Zach Leatherman

Time for another pairing screencast! This time I have Zach Leatherman on, from Filament Group. Zach has done a lot of research, writing, and speaking about web font loading the past few years. He's got a comprehensive guide on it!

There are some problems with the default way that custom fonts are loaded, as in just linking up a font in some CSS through @font-face. Even the way Google Fonts suggests you use their fonts (which is ultimately just vanilla @font-face) is something Zach calls an anti-pattern. Different browsers do different things with @font-face. For example, some versions of Safari make type set in a custom font invisible until the font loads, but have no timeout, so if the font fails to load for any reason, you're in the ultimate worst-case scenario: forever-invisible text on the site.

You don't have a heck of a lot of control over how @font-face fonts load unless you take matters into your own hands. That means things like: inlining the font, subsetting the font (either to a "critical" set of glyphs, or at least to the language in use), or using a font loader to detect when the fonts have loaded and toggling classes to apply them. That last one is particularly interesting. When exerting control over font loading, you not only have to deal with when/how the browser loads the CSS that contains the @font-face, but also when/how the browser comes across text elements that are told to use those fonts. Fonts that aren't used aren't downloaded, so sometimes the procedure is to force them to download, wait for them to finish, then apply classes to actually use them.

Some tools we looked at:

  • Firefox DevTools was better for looking at fonts in use
  • Subsetting fonts can be done in the Font Squirrel generator or Font Prep.
  • What glyphs do you subset? Test what you need at certain URLs with Glyphhanger.
  • How do you tell when the browser is using faux bold/italic? faux-pas.
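The subsetting idea is simple enough to sketch: collect the unique characters a page actually uses and turn them into the unicode-range values you'd hand to a subsetting tool. A toy version of what Glyphhanger automates (the function name here is hypothetical):

```javascript
// Collect the unique code points used in a string and format them
// as CSS unicode-range values, e.g. for a @font-face subset.
function glyphsToUnicodeRange(text) {
  const codePoints = new Set();
  for (const char of text) {
    // for...of iterates by code point, so surrogate pairs stay intact
    codePoints.add(char.codePointAt(0));
  }
  return [...codePoints]
    .sort((a, b) => a - b)
    .map((cp) => "U+" + cp.toString(16).toUpperCase())
    .join(", ");
}

glyphsToUnicodeRange("Hello"); // "U+48, U+65, U+6C, U+6F"
```

A real tool would also collapse adjacent code points into ranges like `U+41-5A`, but the principle is the same: ship only the glyphs the content needs.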

A practical guide to Progressive Web Apps for organisations who don’t know anything about Progressive Web Apps

Sally Jenkinson:

Progressive Web Apps (sometimes referred to as PWAs, because everything in tech needs an acronym) is the encapsulating term for websites following a certain approach, that meet particular technical criteria. The "app" involvement in the name isn’t an accident – these creations share much of the functionality that you’ll find in native experiences – but really, they're just websites.

It's like if you build a website that is so damn good, you get to have a home screen icon on mobile devices. And good is defined by performance and progressive enhancement.

When you hear people say "I want the web to win" they typically mean "I don't want to lose the web to proprietary app development". PWAs seem like an early step toward making web apps not second-class citizens on mobile devices. Maybe there is a future where native app development is web development.

Free, faster.

Ethan Marcotte, on the websites of time- and budget-constrained organizations:

Between the urgency of their work and the size of their resources, spending months on a full redesign isn’t something they can afford to do. Given that, a free theme for, say, WordPress can yield a considerable amount of value, especially to budget-constrained organizations. They can launch their redesign more quickly, and continue reaching the people who need their information most.

Ethan takes a look at a bunch of free themes, so that at least a responsible choice can be made there, and finds:

the results were surprising: on a 3G connection, the slower themes I tested took anywhere from 45-90 seconds for any content to appear. In other words, the pages took roughly a minute before they were usable.

Pretty rough.

What I find particularly scary is that these are just empty themes. I usually attribute the slowness of sites in this category (off the shelf, slap-a-CMS-on-it) to what happens on top of the theme: stuff like uploading too many (or too large) images and installing a million plugins that each load their own set of resources.

I think it shows off some recent technology in a new light: saving us from ourselves. HTTP/2 makes concatenating resources less important; that's saving us from ourselves and those million plugins' individual CSS and JavaScript files. WordPress does responsive images by default now; that's saving us from ourselves by ensuring we aren't loading more image than we need. AMP, as a technology, is saying y'all have lost the plot here and we need to save you from yourselves.

Modernizing our Progressive Enhancement Delivery

Scott Jehl, explaining one of the performance improvements he made to the Filament Group site:

Inlining is a measurably-worthwhile workaround, but it's still a workaround. Fortunately, HTTP/2's Server Push feature brings the performance benefits of inlining without sacrificing cacheability for each file. With Server Push, we can respond to requests for a particular file by immediately sending additional files we know that file depends upon. In other words, the server can respond to a request for `index.html` with `index.html`, `css/site.css`, and `js/site.js`!

Server push seems like one of those big-win things that really incentivize the switch to H2. We have an article about being extra careful about caching and server push.
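Many HTTP/2 servers let you trigger a push with a `Link` response header on the HTML. Something along these lines, with the paths borrowed from the quote above (exact mechanics vary by server and CDN):

```http
Link: </css/site.css>; rel=preload; as=style
Link: </js/site.js>; rel=preload; as=script
```

Unlike inlining, the pushed files land in the cache as separate resources, so repeat visits can skip them entirely.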

The “Optimal Image Format” for Each Browser

Perhaps you've heard about the WebP image format? And how it's a pretty good performance win, for the browsers that support it? Well that's only for Blink-based browsers, at the moment. Estelle Weyl's article Image Optimization explains the best image format for each browser:

  • Chrome: WebP
  • IE 9+ / Edge: JPEG-XR
  • Opera: WebP
  • Safari: JPEG 2000

And you can serve these formats through the <picture><source type=""> syntax.
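A sketch of that syntax with hypothetical filenames (browsers pick the first `type` they support and everyone else falls back to the plain `<img>`):

```html
<picture>
  <source type="image/webp" srcset="photo.webp">
  <source type="image/vnd.ms-photo" srcset="photo.jxr">
  <source type="image/jp2" srcset="photo.jp2">
  <img src="photo.jpg" alt="A photo">
</picture>
```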

Couple that complexity with the complexity of responsive images, and outsourcing image delivery to a dedicated service really seems like the way to go. At least above a certain scale.

Front-End Performance Checklist 2017

Vitaly Friedman's list includes a "Quick Wins" section with the web performance things that can't be ignored. If you aren't setting caching headers on assets, optimizing images, and gzipping, you're leaving some huge and easy performance gains on the table. After you've covered those, then you can dig into Brotli, OCSP stapling, tree-shaking, and whatnot.
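Two of those quick wins show up directly in an asset's response headers. A well-configured CSS file might come back like this (illustrative values):

```http
HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: gzip
Cache-Control: public, max-age=31536000
```

If `Content-Encoding` is missing on text assets, or `Cache-Control` is absent or tiny, that's the low-hanging fruit.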

Speaking of which, I should really look into Brotli, OCSP stapling, tree-shaking, and whatnot.