When I was working at my first “real” job in the field in the mid-2000s, it was hammered into us web devs to build tiny websites (no more than 100KB per page), use JavaScript only for special effects, and make sure everything, from images to Flash content, had a fallback so that JavaScript features progressively enhanced the page. If there was no JavaScript, the site still worked 100%, just not as fancy.
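To make that concrete, here’s a minimal sketch of the kind of progressive enhancement we practiced back then: the link works with plain HTML, and JavaScript only layers extra behavior on top. The URL, endpoint, and ID are made up for illustration.

```html
<!-- The baseline works with no JavaScript at all: the link simply navigates. -->
<a href="/comments" id="comments-link">Read the comments</a>

<script>
  // Enhancement only: if JavaScript (and fetch) is available, load the
  // comments inline instead of navigating. If anything fails, fall back
  // to the plain link behavior above.
  var link = document.getElementById('comments-link');
  if (link && window.fetch) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      fetch('/comments?fragment=true') // hypothetical endpoint
        .then(function (response) { return response.text(); })
        .then(function (html) {
          var section = document.createElement('section');
          section.innerHTML = html;
          link.replaceWith(section);
        })
        .catch(function () {
          window.location.href = link.href; // graceful fallback to navigation
        });
    });
  }
</script>
```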
The reason for this advice was simple: in the early days of the web, everyone was limited to a dial-up Internet connection, which was really slow. Shoot, it took a few minutes just to connect to the Internet, let alone access a website. Hit the “Connect” button and go make some coffee or grab a smoke, because it was literally going to take a while. So, in that sense, it was perfectly reasonable that the key principles of building a good website in the mid-2000s were keeping the site small and providing solid fallbacks for content.
I miss those days a lot. In my opinion, those constraints made the web better. But as time wore on and service-provider technology in the U.S. improved with broadband (and eventually fiber), those constraints stopped being viewed as an issue.
Today, the rules of web development are completely different. It’s more about the developer experience than the user experience: build processes, deciding which framework and tech stack to use, and figuring out where the site lands in Google search results. Sadly, gatekeeping (i.e., you’re not a real “x” if you don’t “y”) and framework battles have replaced the “how do we make this cool thing without JavaScript” conversations. I really don’t pay attention to this stuff because, at the end of the day, it all renders down to HTML, CSS, and JavaScript in the browser; use whatever works for you.

What does bother me, though, is the fact that huge sites that require JavaScript just to use have become the accepted norm. Current stats show websites are weighing in at an average of—big gulp—2MB per page?! And, if you do not have JavaScript enabled, be prepared for the blizzard (that’s what I call sites that only display a white screen).
JavaScript has become the third pillar of the web, right alongside HTML and CSS. I know JavaScript is super useful, so why am I picking on it? I’m picking on the fact that a lot of sites can’t even load if the JavaScript isn’t there. Since when did loading HTML and CSS rely on JavaScript?
I recently moved to a rural area on the outskirts of a major city, and I’m reminded of those early web dev days because, once again, I have a horrible Internet connection on my mobile device. The broadband connection is okay, but when I’m outside or the power goes out, I’m left with the same slow experience of the dial-up days. When I’m browsing the web on my mobile device, I live for Reader Mode, and I have to turn off things like JavaScript (and most images, compliments of JavaScript lazy loading) because it’s all too much to download and run. But just by switching off JavaScript, a lot of sites won’t even load. The white screen of death isn’t my phone dying; it’s my Internet access giving out. And this white screen appears on site after site, hence the blizzard.
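The image part is especially frustrating because lazy loading no longer needs a script at all; browsers can do it natively. A quick sketch (the image path and class name below are made up):

```html
<!-- Script-dependent lazy loading: with JavaScript off, nothing ever copies
     data-src into src, so the image never appears at all. -->
<img data-src="/images/outage-map.jpg" class="lazyload" alt="Outage map">

<!-- Native lazy loading: the browser defers the download on its own, and the
     image still shows up even with JavaScript switched off. -->
<img src="/images/outage-map.jpg" loading="lazy" alt="Outage map">
```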

For the most part, I can get by using the WiFi in the house to do necessary Internet things, like working, shopping, and paying bills. But late this summer, my electricity went out. So I went to the electric company’s website on my mobile device to see when service might be back on, because the entire house runs on electricity and I needed to know whether to make arrangements for things like food and water. But the electric company’s mobile website is large (3MB transferred and 8.6MB in resources — ouch) and won’t load (even with JavaScript enabled).
Angry, I went to Twitter and posted my outrage.
I got some pretty awesome responses showing me that some sites actually do handle constraints really well, like traintimes.org.uk and the text-only version of NPR. YES! Why doesn’t the electric company provide a text-only version of its outage page so that, when people actually need that page, they can use it even when conditions are at their worst?
My power-outage scenario is a one-off type of situation, but there are people across the U.S. and around the world who live with unreliable Internet because there is no other option available to them. NPR published an article covering the struggle of trying to teach students during the pandemic in areas where communities have unreliable Internet, and it’s hard to believe that this is the current state of Internet availability in any part of the U.S. But the fact remains that it is.
It’s clear a lot of people would benefit from the constraints of the early days of web development. The beauty of websites built before broadband networks were available (like the infamous Space Jam site from 1996) is that they can load on any kind of device, from anywhere, under almost any circumstances, because they were built with those constraints in mind. Hopefully, more developers start employing an adaptive loading strategy that provides solutions for users on slow networks or low-end devices.
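Adaptive loading doesn’t have to be complicated, either. Here’s a rough sketch of one way to approach it, assuming a hypothetical extras.bundle.js of non-essential scripts; the Network Information API (navigator.connection) is only available in some browsers, so it’s treated purely as a hint.

```html
<script>
  // The core HTML and CSS have already rendered the page by the time this runs.
  // Only pull in the heavy, non-essential extras when the connection doesn't
  // look constrained.
  var connection = navigator.connection;
  var constrained = Boolean(
    connection &&
    (connection.saveData || /(^|-)2g$/.test(connection.effectiveType || ''))
  );

  if (!constrained) {
    var script = document.createElement('script');
    script.src = '/js/extras.bundle.js'; // hypothetical bundle: carousels, charts, etc.
    document.head.appendChild(script);
  }
</script>
```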
Yes! I wish this was still a consideration.
Where I work, the website was developed by an external company for the marketing team, and it’s ridiculously bloated:
107 requests,
7.3MB transferred,
16.8 seconds on my normal broadband connection.
Amen. I remember when a recommended size for a web page, including assets, was about 35KB, maybe 50KB for a home page. Most JavaScript frameworks are bigger than that. Heck, I’ve seen icons that are bigger than that.
This might be of interest – even without unusual weather or disasters, not everyone has a fast, cheap connection: https://simonhearne.com/2021/inclusive-design/
I was also reading recently about people in the aftermath of disasters having very little bandwidth and a pressing need for information. I think the NPR and CNN sites were mentioned in it. Sadly, I can’t find the link.
The other thing about such slimmed-down sites is that they’re lightning fast if you do have bandwidth, and typically very amenable to caching on CDNs, etc. A 2MB page of bloat still takes time to arrive (and then, just when you think you’ve finished loading, all the GTM scripts, analytics, and crud that websites are plastered with arrive too). And ultimately someone is paying for that bandwidth.
For years I’ve been thinking about how much bandwidth we’ve got compared to 25 years ago – but our pages aren’t any quicker. Rather, we’ve bloated them with video, inefficient images, a tonne of CSS, more JavaScript frameworks than you could reasonably need, and bloody marketing analytics.
Sure, some of that stuff can be really useful; web applications can do so much more, be so much richer – but it’s not mandatory, and some applications just need to be fast, light, efficient.
Interesting read. While I didn’t write code back in the day, I strongly believe in function over fashion. Having a page where the content doesn’t even load is neither.