The Space Jam website has been a time capsule. You’d head to spacejam.com and suddenly be teleported back to 1996; a time before CSS, a time when using
<table> elements to make grids and layouts on a page was cool, a time before DevTools and Firebug showed us what the heck we were even building.
The Space Jam website is special because it’s a piece of web design history. It shows us how far we’ve come.
But if you head to the site today, you’ll find a completely redesigned website for the new movie starring LeBron James. You can still access the old site by clicking a logo in the top right, which is neat, but there’s something a tiny bit sad to me about the old website no longer being attached to this URL.
While I was lamenting that the URL had changed and the old site was lost to the sands of time, Max Böck had other concerns. He wrote this great piece about performance, comparing the original 1996 website with this new version. After running both sites through WebPageTest, Max comes to this conclusion:
[…] after 25 years of technological progress, after bringing 4.7 billion people in the world online, after we just landed a fifth robot on Mars, visiting the Space Jam website is now 1.3 seconds faster. That seems…underwhelming.
So if the old Space Jam website shows us how far we’ve come, then the replacement shows us how far we still have to go.
And despite all the tools we have now — not to mention the conferences, books, and websites that are dedicated to the subject — we’ve sort of stagnated. Why after all these years are we stuck with painfully slow websites? Why haven’t things improved much at all and, in many cases, actually gotten a whole lot worse?
There are probably a lot of reasons, but I think it’s ultimately a cultural problem. Many folks, and not all of them developers, tend to believe performance is a nice-to-have, an additional feature we can get to later, rather than something baked into our day-to-day work.
Kealan Parr wrote about bad web performance and how to improve it the other day. He argued that a slow website isn’t just an annoying experience for users but a huge detriment to a business as well. Or, to put it another way, bad performance is bad for business:
[…] Firefox made their webpages load 2.2 seconds faster on average and it drove 60 million more Firefox downloads per year. Speed is also something Google considers when ranking your website placement on mobile. Having a slow site might leave you on page 452 of search results, regardless of any other metric.
How do we make fast websites though? Kealan explains:
Here’s the thing: performance is more than a one-off task. It’s inherently tied to everything we build and develop. So, while it’s tempting to solve everything in one fell swoop, the best approach to improving performance might be an iterative one. Determine if there’s any low-hanging fruit, and figure out what might be bigger or long-term efforts. In other words, incremental improvements are a great way to score performance wins. Again, every millisecond counts.
Incremental wins, I like that. But before we dive in and try to make our websites fast, we need to understand why things are slow in the first place. That’s why I enjoyed this other performance-related blog post by Jake Archibald where he looks into who has the fastest F1 racing website in 2021.
What’s great about this post is that all the recommendations that Jake suggests are tiny improvements that would shave off whole seconds from each website. And, thankfully, these are all small enough that they don’t require restructuring your whole organization.
Also, there are so many great bits of advice that can be applied to the websites that we’re all building right now, like this one:
[…] it’s important to avoid hosting render-blocking content on other servers.
That’s always a good reminder and it’s why I now tend to avoid hosting images, fonts, and CSS on other people’s servers.
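To make that concrete, here’s a hypothetical document head where the render-blocking CSS and fonts live on your own origin rather than someone else’s server (the file paths are invented for illustration):

```html
<head>
  <!-- Render-blocking stylesheet served from our own origin,
       so no extra DNS lookup or TLS handshake delays first paint -->
  <link rel="stylesheet" href="/css/site.css">

  <!-- Self-hosted font file, preloaded so it starts downloading early.
       Font requests always use CORS mode, so crossorigin is required
       even for same-origin fonts. -->
  <link rel="preload" href="/fonts/body.woff2" as="font"
        type="font/woff2" crossorigin>

  <style>
    /* font-display: swap keeps text visible while the font loads */
    @font-face {
      font-family: "Body";
      src: url("/fonts/body.woff2") format("woff2");
      font-display: swap;
    }
  </style>
</head>
```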
Speaking of handy advice, I thoroughly enjoyed this Devs Answer interview with Chris where he talks about web development career advice (write write write!) and about the anxiety you might have when it comes to bleeding edge technologies of the web (don’t worry about it!).
Notes on Building Dark Mode
The other week, I wrote about building dark mode at Sentry and all the problems we encountered along the way. But isn’t dark mode just a fad? Isn’t it just a thing that makes developers like us feel cool?
Well, I think there’s more to it than that. The process of building dark mode alleviates a ton of design systems problems along the way:
For most front-end codebases, the design of your color system shows you where your radioactive styles are. It shows you how things are tied together, and what depends on what. Sure, we wanted dark mode to look great. But we also wanted to make sure that dark mode doesn’t slow us down by introducing even more problems than we already have.
And I think that’s what our team achieved here. We made our designs more consistent, buried those radioactive styles, made relationships between colors, and hopefully slightly improved the way we build front-end components moving forward.
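The general pattern behind “making relationships between colors” is semantic design tokens. This isn’t Sentry’s actual implementation, just a minimal sketch of the idea: components only ever reference semantic names, and the raw palette values get swapped per color scheme in one place:

```css
/* Raw palette — components never reference these directly */
:root {
  --gray-100: #fafafa;
  --gray-900: #1a1a2e;
}

/* Semantic tokens: the only names components use */
:root {
  --background: var(--gray-100);
  --text: var(--gray-900);
}

/* Dark mode swaps the underlying values in a single place */
@media (prefers-color-scheme: dark) {
  :root {
    --background: var(--gray-900);
    --text: var(--gray-100);
  }
}

.card {
  background: var(--background);
  color: var(--text);
}
```

The point is that no component needs to know dark mode exists — that knowledge lives entirely in the token layer.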
On a similar note, I spotted this experimental and fun take on dark mode by Nikita Tonsky on his blog where it’s like a light has been turned off and your cursor acts as a lamp in the dark.
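The basic trick, as I understand it, is a fixed overlay whose background is a radial gradient that follows the cursor: transparent near the pointer, near-black everywhere else. A rough sketch, not Nikita’s actual code:

```javascript
// Build the CSS background for a "flashlight" overlay:
// transparent around the cursor, dark everywhere else.
function lampGradient(x, y, radius = 150) {
  return `radial-gradient(circle ${radius}px at ${x}px ${y}px, ` +
    `transparent 0%, rgba(0, 0, 0, 0.95) 100%)`;
}

// In the browser, wire it up to the mouse (guarded so the helper
// above can also run outside a DOM).
if (typeof document !== "undefined") {
  const overlay = document.createElement("div");
  overlay.style.cssText =
    "position: fixed; inset: 0; pointer-events: none; z-index: 9999;";
  document.body.appendChild(overlay);

  document.addEventListener("mousemove", (e) => {
    overlay.style.background = lampGradient(e.clientX, e.clientY);
  });
}
```

Setting `pointer-events: none` on the overlay keeps the page underneath clickable while the lamp effect sits on top.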
Swipey Image Grid
Cassie makes a great point here when it comes to SVG animations and how you can animate a lot more than just paths and vectors:
When someone says ‘SVG animation’, what do you picture? From conversations at my workshops I’ve noticed that most people think of illustrative animation. But SVG can come in handy for so much more than jazzy graphics. When you stop looking at SVG purely as a format for illustrations and icons, it’s like a lightbulb goes on and a whole world of UI styling opens up.
Cassie then shows us how to use SVG to make a responsive animated grid of images.
It’s a pretty dang smart technique and now I’m wondering how else I might apply SVG animations.
Jamstack needs a serverless database! Fauna helps you simplify code, reduce costs, and ship faster by delivering powerful database capabilities as a data API with support for GraphQL and custom business logic.
[Chris]: Adobe launched this thing called Super Resolution for Lightroom and Photoshop. Eric Chan:
The term “Super Resolution” refers to the process of improving the quality of a photo by boosting its apparent resolution. Enlarging a photo often produces blurry details, but Super Resolution has an ace up its sleeve — an advanced machine learning model trained on millions of photos. Backed by this vast training set, Super Resolution can intelligently enlarge photos while maintaining clean edges and preserving important details.
So you can essentially turn a low-res image into a higher-res image thanks to machine learning and fancy codin’. The obvious joke is that the “ZOOM IN! ENHANCE!” crime-show trope, which seemed like a complete impossibility not long ago, is now real. This isn’t the first of its kind. Google did it back in 2017 and I believe it was immediately used for human surveillance. Ugajdkf. Now it’s right in some of the most popular design software in the world, so that’s significant, although the tech was already kinda out there.
Doesn’t it seem like if machine learning can fix things like the resolution of images, it could help little ol’ HTML as well? One of the crappiest accessibility issues is when <input>s are missing <label>s or they aren’t connected properly. I’m glad Amber Wilson has written the canonical article on this now. I know that papering over accessibility issues is bad news, but doesn’t it seem like if you taught a web browser what 1,000,000 bad examples of web forms are and showed it 1,000,000 good examples, it could take a stab at properly connecting inputs and labels when it encounters a bad example? Maybe this isn’t a good idea. Maybe the incentive isn’t there to do it. I dunno, but I’d love to see more of the greatest minds of our generation working on problems like this that could help people.
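For reference, the two correct ways to associate a label with an input — the thing such a tool would have to infer — look like this:

```html
<!-- Explicit association: the label's "for" matches the input's "id" -->
<label for="email">Email address</label>
<input type="email" id="email" name="email">

<!-- Implicit association: the input is nested inside the label -->
<label>
  Email address
  <input type="email" name="email">
</label>
```

Either way, clicking the label focuses the input and screen readers announce the label text when the field is focused — which is exactly what breaks when the connection is missing.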