The reason for this advice was simple: in the early days of the web, everyone was limited to a dial-up Internet connection, which was really slow. Shoot, it took a few minutes just to connect to the Internet, let alone access a website. You'd hit the “Connect” button and go make some coffee or grab a smoke, because it was literally going to take a few minutes. So, in that sense, it’s perfectly reasonable that the key principles of building a good website in the mid-2000s involved making the site small with solid fallbacks for content.
I miss those days a lot. In my opinion, those constraints made the web better. But as time wore on and service provider technology in the U.S. continued to improve with broadband (and eventually fiber), those constraints stopped being treated as an issue at all.
Angry, I went to Twitter and posted my outrage.
I got some pretty awesome responses showing me that some sites actually do handle constraints really well, like traintimes.org.uk and the text-only version of NPR. YES! Why don’t they provide a text-only version of the outage page, so that when people actually need that page of the site, they can use it even when conditions are at their worst?
My power-outage scenario is a one-off type of situation, but there are people across the U.S. and the entire world who live with unreliable Internet because there is no other option available to them. NPR published an article covering the struggles of trying to teach students during the pandemic in communities with unreliable Internet. It’s hard to believe that this is the current state of Internet availability in any part of the U.S., but the fact remains that it is.
It’s clear a lot of people would benefit from the constraints of the early days of web development. The beauty of websites built before broadband networks were available (like the famous Space Jam site from 1996) is that they can load on any kind of device from anywhere under almost any circumstances, because they were built with these constraints in mind. Hopefully, all developers will start employing an adaptive loading strategy that provides solutions for users on slow networks or low-end devices.
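To make "adaptive loading" concrete, here's a minimal sketch of the idea: ask the browser what kind of connection the user is on and serve a lighter experience when it's slow. It leans on the Network Information API (`navigator.connection`), which is non-standard and currently Chromium-only, so unsupported browsers fall back to the full experience. The function names and the "lite"/"full" split are illustrative, not from any particular site's implementation.

```javascript
// Decide whether to load heavy assets based on the reported connection.
// Kept as a pure function so the policy is easy to test; the thresholds
// here are illustrative, not a recommendation.
function shouldLoadHeavyAssets(effectiveType, saveData) {
  if (saveData) return false; // user explicitly asked for reduced data usage
  // effectiveType is one of "slow-2g", "2g", "3g", "4g"
  return effectiveType === "4g";
}

// In the browser, feed the policy from the (Chromium-only) Network
// Information API, defaulting to the full experience where it's unavailable.
function pickExperience() {
  const conn = typeof navigator !== "undefined" && navigator.connection;
  if (!conn) return "full"; // API unsupported: assume a capable connection
  return shouldLoadHeavyAssets(conn.effectiveType, conn.saveData)
    ? "full"
    : "lite"; // e.g. text-only markup, no autoplay video, system fonts
}
```

The same signals can drive decisions server-side (via the `Save-Data` and `ECT` client hints) so the lite page is small from the very first byte, rather than shipping the bloat and hiding it.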
Yes! I wish this was still a consideration.
Where I work, the website was developed by an external company for the marketing team, and it takes a ridiculously bloated 16.8 seconds to load on my normal broadband connection.
This might be of interest – even without unusual weather or disasters, not everyone has a fast, cheap connection: https://simonhearne.com/2021/inclusive-design/
I was also reading recently about people in the aftermath of disasters having very little bandwidth and a pressing need for information. I think the NPR and CNN sites were mentioned in it. Sadly, I can’t find the link.
The other thing about such slimmed-down sites is that they’re lightning fast if you do have bandwidth, and typically very amenable to caching on CDNs. A 2MB page of bloat still takes time to arrive (and then, just when you think you’ve finished loading, all the GTM scripts, analytics, and crud that websites are plastered with arrive too). And ultimately someone is paying for that bandwidth.
Sure, some of that stuff can be really useful; web applications can do so much more, be so much richer – but it’s not mandatory, and some applications just need to be fast, light, efficient.
Interesting read. While I didn’t write code back in the day, I strongly believe in function over fashion. Having a page where the content doesn’t even load is neither.