Alex Russell, describing a prototype of “Never-Slow Mode” in Chrome:
… blocks large scripts, sets budgets for certain resource types (script, font, css, images), turns off document.write(), clobbers sync XHR, enables client-hints pervasively, and buffers resources without Content-Length set.
Craig Hockenberry, posting an idea to the WebKit bug tracker:
The situation I’m envisioning is that a site can show me any advertising they want as long as they keep the overall size under a fixed amount, say one megabyte per page. If they work hard to make their site efficient, I’m happy to provide my eyeballs.
Sometimes name-and-shame is an effective tactic to spark change.
Addy Osmani writes about an ESLint rule that prohibits particular packages, which you could use to prevent usage of known-to-be-huge packages. So if someone tries to load the entirety of lodash or moment.js, it can be stopped at the linting level.
The truth universally ignored is that latency kills performance long before bandwidth does.
Want proof? If bandwidth mattered, YouTube and Netflix would not even exist.
There is ABSOLUTELY NOTHING demanding more than 1 HTTP request PER PAGE.
What? But if the network were 20 times faster, would this post still be relevant? I think the network is still the culprit. I should be able to download 1 GB a second. Why not?
Don’t blame JS for a development issue. JS is the best thing that has happened to web development. How about the hundreds of online courses saying you could be a web developer in less than two months? Today we have nice browsers, powerful processors, and cheap RAM; we also need nice web apps. Web apps are the cheapest and most effective way a company can develop a multi-platform information system. AJAX, charts, and DOM manipulation are necessary to offer functional and beautiful UX. And we need JS for that.