Over 14,000 people responded to this poll, in which I asked: “What do you think is a reasonable page size to try and stay under for a modern web design?”
Here’s my simple attempt to make the results visual:
The most popular opinion was that 500k is a good size to stay under, with 22% of the vote. Very few people said over 1 MB: the three heaviest options offered (1.5 MB, 2 MB, and over 2 MB) combined garnered only 13% of the vote.
Another way to say it: 87% of people think a website’s page size should be 1 MB or under.
For the record, CSS-Tricks (at the time of this writing, v9) is about 750k on a fresh load (average 3 seconds) and about 100k on cached reloads (average 1.5 seconds). Those times are based on my pretty average broadband connection. I’m fairly happy with that, although it’s heavy for mobile, and if I had more time I’d find ways to optimize it.
It goes without saying that the lower the page size, the better: less information needs to travel the network, so your page loads faster. But of course, how fast your page loads depends on many more factors than page size alone. The number of requests matters greatly, as reader basitian said:
I don’t think it’s only a question of page weight. From a technical perspective, you should at least take the weight AND the number of requests into account.
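To make that concrete, here’s a rough back-of-envelope model (my own sketch, not from the comment) of how weight and request count both feed into load time. All the numbers are illustrative assumptions, and real browsers parallelize requests, so treat it as intuition, not measurement:

```javascript
// Back-of-envelope load-time model (assumption: each request costs one
// round trip and transfers happen sequentially).
// sizeKB: total page weight in kilobytes
// bandwidthKbps: connection speed in kilobits per second
// requests: number of HTTP requests
// rttMs: round-trip time to the server in milliseconds
function estimateLoadSeconds(sizeKB, bandwidthKbps, requests, rttMs) {
  var transfer = (sizeKB * 8) / bandwidthKbps; // seconds moving the bytes
  var latency = (requests * rttMs) / 1000;     // seconds lost to round trips
  return transfer + latency;
}

// A 1 MB page over a 10 Mbps link with 44 requests at 50 ms RTT:
// transfer is about 0.8 s, latency about 2.2 s; the requests dominate.
```

Notice that on a fast connection the round trips cost more than the bytes themselves, which is basitian’s point: weight alone doesn’t tell the story.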
How quickly your server can serve those requests matters too, as does your user’s internet connection speed. Remember to think about how your personal internet speed compares to others worldwide (assuming worldwide is your audience, which it probably is if it’s a website). Jan-Marten de Boer says:
In most rural areas in my country, a DSL speed of only 500 kb/s is the norm
Earlier in this article I wrote that my broadband was “pretty average”. Running a quick speed test, I’m getting 21 Mbps down and 6 Mbps up. That’s roughly 40 times faster than Jan-Marten’s connection. That’s an eye-opener, eh?
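The gap is easy to work out, since transfer time scales inversely with bandwidth. A quick sketch (transfer only, ignoring latency, which would only widen the gap):

```javascript
// Seconds to transfer a page of a given size, ignoring latency and overhead.
// 1 MB = 8 megabits; bandwidth is in megabits per second.
function transferSeconds(sizeMB, bandwidthMbps) {
  return (sizeMB * 8) / bandwidthMbps;
}

transferSeconds(1, 0.5); // 16 s for a 1 MB page on a 500 kb/s DSL line
transferSeconds(1, 21);  // ~0.38 s for the same page on a 21 Mbps connection
```

So the 1 MB budget most respondents voted for is a quick blip on my connection and a long wait on Jan-Marten’s.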
How the page loads is another consideration. Read this comment by Mike Hopley, in which he lays out four scenarios for how a 10-second page load might go down. There are good and bad ways, the best of which is where, after the first second passes, the text content of the page is ready to consume, and further loading doesn’t scramble things around. Jake Archibald puts it another way: “The time it takes for the page to load isn’t the real issue; it’s the time it takes the page to deliver what the user wants.”
Lara Swanson of Dyn shares the same sentiment. She thinks 200k is a more reasonable goal, and that the document-complete time matters more than the fully loaded page:
I don’t really care about async code that renders after the document completes, and I also don’t care about assets that don’t block a user from being able to read and use a page.
I’d say the best practice is to do the best you can on all fronts and improve them any chance you get. Reduce the page size. Combine or reduce requests. Juice up your server and use a CDN. Make sure your loading process is clean, without too much reflow. Lazy-load or asynchronously load as much as you can.
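As one concrete example of that last point, a non-critical script (comments, share widgets) can be injected after the initial render instead of sitting as a blocking script tag in the head. A minimal sketch; the file name is hypothetical:

```javascript
// Inject a script tag with the async attribute so the browser fetches it
// without blocking parsing or rendering of the rest of the page.
// Browser-only: relies on the DOM being available.
function loadScriptAsync(src) {
  var script = document.createElement("script");
  script.src = src;
  script.async = true;
  document.head.appendChild(script);
  return script;
}

// e.g. once the main content has rendered:
// loadScriptAsync("comments-widget.js");
```

This keeps the heavy extras out of the critical path, which is exactly why Lara doesn’t count them against the page: the user can read and use the page before they ever arrive.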
To wrap up, here are some interesting averages compiled by Sreeram Ramachandran of Google in 2010 (4.2 billion sites analyzed):
44 requests per page
320k total page size
Full, real-time results for this poll and all past polls are in the polls archive.