The new poll is up (in the sidebar on the actual site, RSS folks) and it reads:
Would you rather host a 200k file on a major CDN or a 20k file self-hosted?
This requires a little explanation. Let’s say the file in question is jQuery UI. On your site, you only need it for it’s tabs functionality. You could load a copy of it from Google’s CDN like this:
<script src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.14/jquery-ui.min.js"></script>
That file is literally 205k, because it’s the complete jQuery UI library. But, it’s on what is probably the world’s biggest/fastest/most-used CDN (Content Delivery Network). Lots of sites use jQuery UI and probably get it from there, so you have a decent chance of that file already being cached when a user gets to your site.
Or, you could download a customized copy of jQuery UI.
<script src="/js/jquery-ui-1.8.14.custom.min.js"></script>
This file could be only 20k, because it is trimmed down to just the stuff necessary to make the tabs functionality work. That file is ten times smaller, but the chances of it already being cached are nil for a first-time visitor.
So all other things being equal, which do you choose?
I will choose the small file hosted on our own server. The visitor has to load content from our server anyway, so loading a 20K file from the same server on today’s internet connections will hardly be noticeable.
You moron. Today’s internet connections have increased in bandwidth but latency hasn’t kept pace at all. The difference between loading an asset from your local cache and making an extra HTTP request is more a matter of latency than bandwidth. Using a CDN makes it much more likely to get a cache hit and not download the file at all.
I chose CDN.
200K already in the user’s cache beats any HTTP request; I don’t care even if the other file is just 1 byte.
Adding a CDN will bring in another DNS lookup time. :-(
DNS lookups are cached too though, and the chances are ajax.googleapis.com will already be cached.
You are using DNS Manual Prefetching, aren’t you? Aren’t you?
James, I don’t think you understand. Prefetching is NOT caching.
I suggest you read a book or 2 before citing buzzwords like some kind of moronic HR Consultant.
Jon, I don’t think YOU understand.
I was referring to the comment ‘Adding a CDN will bring in another DNS lookup time’. If you DNS Manual Prefetch, it will prefetch the DNS of the CDN when the HTML loads and not wait for the subsequent JS calls at the bottom of your page… Then when the JavaScript library is finally requested, the DNS will already be cached in the browser….
If you honestly think that I thought DNS Manual Prefetching actually cached the JavaScript, maybe you should heed your own advice.
Nice try though!
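For anyone unfamiliar with the hint being argued about here: DNS prefetching is a one-line addition to the page’s head (the hostname below is the Google CDN host from the article):

```html
<!-- Ask the browser to resolve the CDN hostname as soon as the HTML
     loads, instead of when the <script> tag at the bottom is reached -->
<link rel="dns-prefetch" href="//ajax.googleapis.com">
```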
I choose CDN too, cache wins all the time :D
I prefer CDN with fallbacks in case of availability issues.
Technical/security issues aside, I wonder if one day these libraries could be available via the browser itself… but this might cause its own pitfalls, such as an outdated browser using an outdated library. The CDN is the best option currently.
This is actually a great idea. I would take it a bit further and say the browser should both come with the library stocked, in the 3 most recent versions, and have an auto-update process that periodically checks for new versions of the framework, pulls it down and pre-caches the file for the user.
yep, in html5 boilerplate, you can see a call to jquery CDN with a local fallback
I choose CDN too and use this same method.
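For reference, the CDN-with-local-fallback pattern being discussed looks roughly like this (the jQuery version and local path are illustrative, not prescriptive); the small helper below factors the check out of the markup so the logic can be seen in isolation:

```javascript
// In the page, the pattern is two script tags (HTML5 Boilerplate ships
// a variant of this):
//
//   <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>
//   <script>window.jQuery || document.write('<script src="/js/jquery-1.6.2.min.js"><\/script>')</script>
//
// If the CDN copy loaded, window.jQuery is defined and nothing happens;
// if not, a tag pointing at the local copy is written instead.
function fallbackTag(globalObj, localSrc) {
  return globalObj ? null : '<script src="' + localSrc + '"><\/script>';
}
```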
If there were something other than document.write to handle this, it would be a good idea, but the reason we got away from it in the first place is that it’s such a resource hog. document.write is never a good idea; we use the Google CDN for speed, and falling back to something that slow is completely counterproductive.
Good idea… btw HTML5 Boilerplate comes with this too. I think it’s an amazing tool.
HTML5 Boilerplate is a piece of shit.
Can this technique be modified to work on wordpress?
I choose the CDN in the first place, but with a local fallback. Could that maybe be another option in the poll?
Same here. A couple days ago I submitted an enhancement request with WordPress to include an optional parameter with wp_register_script to assign a local fallback when loading a CDN script.
All sites that use certain CDN services load like mud in my country because it seems that certain IPs are blocked (censored). Whenever I get to a site that takes 5 minutes or more to load (without either CSS or JS), I just know it is a CDN problem.
I don’t think the actual CDN service is blocked, just certain IPs that are associated with it for some reason unknown to me. This is annoying, but it also taught me something. Do I want my visitors (who use an average 2Mbit ADSL connection) to wait 2 seconds more for some file to load, or run the risk that my site breaks for whatever reason because I have no control over the content?
So I’d rather leave all files on my own server.
Absolutely correct.
And I really hate sites that pull from all over the place. I use NoScript. I like it when all the content comes from one location.
This has been my thinking for a while. I like the *idea* of CDN, but I think it’s introducing an unnecessary stability risk. And I know everyone likes to think that Google is impenetrable, but they will probably successfully be hacked one day. When that happens, and someone injects malicious password hacking code into the distributed copy of jQuery, boom goes the dynamite.
CDN for me too, they are generally faster for my purposes and allow for always up to date release usage which is always nice when you release software to the public.
Automatically using the most up-to-date version of a library leaves you scrambling to keep up when the API changes. I would much rather test the latest version in a dev/qa/staging environment first before accepting the development team’s word that it is “backwards compatible”.
Definitely the 20k file.
CDN all day, every day… it saves me bandwidth, and the chances are high that it is already cached anyway.
Smaller file size means less code for the browser to parse and execute. So I definitely choose the small local file.
But the size really matters. If the difference is 10 times as you say, then it’s worth it.
I agree. I do not want to load unnecessary libraries, CSS, JS… the traffic is not a problem, but client memory and browser parsing time are important.
I prefer optimized code over “flexible” code that loads a bunch of unused code.
Small local file, so far.
The reason behind the decision is that the probability of having the CDN file cached is IMHO very low – the default browser cache sizes are REALLY small (50MB on Firefox, automatic in IE8 – but hovers at 50-250MB, 20MB in Opera) – and that needs to include all the YouTube SWFs, images, CSS and quite often HTML for all the sites that the user visits. So, unless you’re using the same libraries AND the same CDN AND the same library version as some of the really popular sites (e.g. Facebook, NYTimes) – the chances are that the file is not cached.
I was planning to do some research in regards to this question… Firefox has some cache viewer plugins – all it takes is to take a peek (after getting the permission) at the cache of several _normal_ people (your Mom, your accountant, random non-geek guy in Starbucks, etc) and see how often the CDN files are there.
For obvious reasons the CDN is the better route. The user will most likely get an improvement in load time, and it takes the load of serving that file off of your own server. Win-win.
Of course, the research may just prove me wrong :) Checked my Opera cache – which is set to 400Mb and has over 20k files (the oldest one was accessed in August, 2010) – I do have all versions of jQuery cached. But then again – the websites I frequent are probably not the same as my audience and I do have the max cache size set explicitly.
Do you realize what the chances are that jQuery UI version x.x.xx is already cached?
Here are the variables:
website that uses jQuery
website that uses jQuery UI
website that uses jQuery UI x.x.xx
website that uses jQuery UI x.x.xx from a CDN
website that uses jQuery UI x.x.xx from a CDN that is Google
I say chances are 80% of your users are going to download 200K while visiting your website. Plus, keep in mind, 200KB of JavaScript is going to be in the browser’s memory, thus “slowing down” the whole experience.
I pick 20K script on a free CDN, and you should too.
I’m with Frederico. Even though jQuery is popular, jQuery UI is less so. If you think the stats of jQuery UI usage are anywhere near as high as jQuery itself, you are in for a shock.
The people responding, in my opinion, aren’t understanding the question properly. They’re responding: I will take cache. But the question is more accurately about risk mitigation:
1. user might have cached jQuery UI, but might not. Is the risk worth it to you for the larger potential download?
2. user will almost surely not have your custom version on first visit. But they will also have a much smaller download. Do you prefer that risk?
I pick #2 until the statistics show that visitors to the site I’m building have high odds of cached jQuery UI. I’d need to see the statistics first, saying that there’s an 80-90% chance they will have jQuery UI cached from the site I’m having it served from. Then I will change my tune.
I completely agree with Greg and Federico: the chances of a visitor already having that specific 200k version cached from the Google CDN are low, so I would go with the 20k file that I have control over too.
Also here are some stats about popularity of jQuery vs jQuery UI
http://trends.builtwith.com/javascript
I’m agreeing with the 20kb file. Additionally, to ease up on requests you could combine this file with all your other JavaScript files to make one “script” tag, and therefore one request. You can use a tool called “Minify” to do this, or simply create your own (as I did because I am using ASP… gah).
People say “ohh, caching”, but it’s only a problem the first time. So it’s a 20kb download once, and then it’s cached every time the user visits your site as well. Caching isn’t a sole property of CDNs.
jQuery should be pre-packaged ( included ) with browsers.
Just like a side of salad that comes with the meal.
Wouldn’t that be impossible to keep up to date?
Every time the framework is updated you need to update your browser, and I guess we all know how well people update their browsers… No thanks, I think I’d prefer using the cached version on the CDN. As others said, there’s a big chance that people already have a cached version of the framework available to them.
The versioning is manageable: jQuery in browsers should be updated separately from the browser itself, the same way as Adobe Flash – you don’t have to update the browser to get a new version of Flash.
JavaScript is a client-side thing, right? So why not have it bundled with browsers?
Keep in mind that it would require ALL websites using jQuery to be constantly updated – some changes between versions are quite big and need small (yet important) code updates. Look at jQuery 1.4 (still quite popular) versus jQuery 1.6.
I agree that it should be included by default since it’s open source and extremely popular, and as for having to update your browser all the time, this should be done in the background, as Chrome currently does.
Remember Google Chrome is already doing this with Flash, including it by default and updating it silently in the background.
I know Chrome does this very well, but we’re not living in an ideal world. We all know how well-updated Internet Explorer is on most users’ PCs (let alone in large companies); a certain client of ours goes even further and blocks a lot of incoming traffic, which makes auto-updates nearly impossible, and I’m quite sure there are more companies who do so.
In addition, I’m thinking of how that should be implemented: loading it by default could cause other frameworks, or self-written code, to behave differently than the creators intended.
The x-factor is how many repeat visitors I have on my own site. Considering the TCP slow-start algorithm, a 20k file will get downloaded in 3 windows (3 + 5 + 7 segments). That will download in a reasonable time for most users.
Of course cache is faster, especially if fetched from an SSD, but given the fact that the larger library will parse and execute more slowly and use more memory, I would choose the 20k self-hosted option on any site where about 50 percent of visits can be expected to be repeat visits.
The general internet connection for home users in Holland has a bandwidth of 8Mbit/sec. This translates to 1MB/sec for the entire network at home. Considering at least someone uses Skype, someone opens YouTube and you’re trying to open a site, you can take for granted that he or she has a mere 200k/sec left of bandwidth. This makes the choice a geographically diverse call. If my client lives in the US, I’d definitely go for the 200k file on the CDN, but since most Dutch sites don’t follow good web standards, changes are pretty high that I’m the first to offer them the CDN link. That gives me a full second’s disadvantage in loading time. That’s huge!
I’m pretty confident as a developer, so a lot of times I’m prone to simply rewriting the jQuery UI libs to integrate “natively” in my other JS and then I’ll just serve a <20k file with all my JS combined. It really depends on the situation, but that's mostly the way we roll. /me hates the bandwidth problems.
*chances
Why stop there? Why not add more unnecessary scripts from the Google CDN since they’ll be cached? Doesn’t YAGNI stand for “You’re always gonna need it”? ;)
If you want, you could do what Yahoo did years ago and get some real metrics on how many of your visitors browse with a full cache vs. an empty cache experience.
What Yahoo found was that 40-60% of their own users had an empty cache, and about 20% of all page views were with an empty cache.
Personally, I prefer to keep my scripts concise and pared down only to what I really need, and serve a single, combined and minified file. I realize that “tabs” functionality is just an example here, but it confounds me why anyone would use jQueryUI just for tabs when they’re easy enough to implement with jQuery itself, or—heaven forbid—vanilla JavaScript.
We tried using the Google CDN for a while but in practice it was slower. The “theoretical” benefits never materialized and yet you will see this everywhere on a list of best practices and recommendations. I think it’s just common wisdom. People need to write articles about the Top 10 Things You Can Do Right Now because it’s their job. But they just repurpose ideas from other articles which got advice from other articles etc. Nobody really measures it. We did, and it was slower. We bundle jquery with our other site specific javascript in a single minified package for delivery.
I’m interested in how you measured the performance of Google’s CDN. Did you test it locally on your machines, or from analytics?
When using a CDN, something you MUST not forget is to use the full version number:
http://ajax.googleapis.com/ajax/libs/jqueryui/1.8/jquery-ui.min.js → cached for 1 hour
http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.14/jquery-ui.min.js → cached for 1 year
I’d go with the CDN, only because that way, it may already be cached, and even if it isn’t, what will be lost during loading?… Tab functionality. I don’t think that would mean the end of the world.
I’d choose the CDN, since the web is an open ecosystem and such choices help it remain so. And since Chris has already mentioned the very strong advantage of letting a CDN take care of libraries, why use a local copy?
It’s remarkable how strong the arguments are for a local file against a CDN version, whereas the CDN lovers come up with arguments they most likely take for granted and haven’t measured. Sure, there is a benefit to parallel downloading using multiple domains (an argument I haven’t spotted), but you could easily mimic that using your own cookie-free subdomain (the cookie-free part is really important, since it drastically improves the request).
I picked 20K local because this seems to work best for small to medium websites with a local audience. Also, you can merge this 20K with the rest of the JS on your site into a single JS file.
I still have a tendency to use local JS libs, other than the jQuery one from Google, about half the time.
Local. Think about mobile! Paying for 200k is more than paying for 20k. And same with waiting when you’re at some place where there’s no 3G.
I vote 20k local, for reasons already well summarized by Dominykas and Frederico.
In the real world if you are using jQuery you are also adding your own code, and possibly some jQuery plugins on top of that. In most cases it’s better to serve one single Javascript file than serve jQuery separately, then your own file as well.
Answer C: Implement metrics and A/B test it to figure out which delivers the file quicker.
The Resource Timing metrics that are being implemented in browsers will help a lot with this sort of thing: http://w3c-test.org/webperf/specs/ResourceTiming/
Doing A/B testing would be interesting, but I don’t see any very compelling reasons why you wouldn’t just set up AWS or another CDN to serve up the 20k file. It’s very cheap and easy, even if you serve millions of visitors.
Any downsides to that logic?
Paul’s got it right.
Eric: For some people, maybe cost? It’d be so tiny, though (at least in Europe and North America), that you could probably find the funds to pay for it by looking down at the sidewalk and finding some change that somebody dropped.
Or even though it’s relatively easy, the small technical barrier some people might perceive in setting up such a thing?
User traffic and user locations are important factors.
Here’s a discussion I came across on Stackoverflow: http://bit.ly/iXWBqQ
“When a specific request is made by a user, the server closest to that user (in terms of the minimum number of nodes between the server and the user) is dynamically determined. This optimizes the speed with which the content is delivered to that user.” – from the stack discussion
I can see how user location can be important in whether or not the CDN provider will actually improve load time; although I’m sure Google is well covered for jQuery needs.
Taking the example of jQuery UI, the 200k vs 20k thing is fairly irrelevant since realistically in both cases they will be gzipped. After gzipping it is 51k vs 7k.
7K is not a lot to bundle into your other JS.
“Let’s say the file in question is jQuery UI. On your site, you only need it for it’s tabs functionality.”
its*
I wonder how many people struggle with this dilemma but then create websites with flash, funky “tinsel” and all sorts of other visual garnish?
Not saying people should disregard such considerations but there are other ways to save more bandwidth and save a few requests if it’s important.
I’d just use a CDN because it’ll be up to date, more likely to be cached, and because it’s one less thing I have to upload.
A practical side of me doesn’t like the 180 KB that remains unused. Why give it to the user if he won’t use it? It almost seems… unsemantic.
The big factor, I think, is mobile networks – not only is 180 KB a network hog; the cache benefit is gone as well. With the growing use of mobile devices on the web, this is worth considering. Granted, these devices will get faster, but I’m pretty sure 180 KB will still cut into the data quota a bit. Additionally, I agree that not as many users will have the item cached as expected, and the geographical speed gain from the CDN is offset by the file size. Also, as noted, the user is likely to return, resulting in a cached but smaller file.
I do concede that if the “small” file adds a large enough timespan to page load, the delay could dissuade potential users. On the other hand, firewalls or other filters often block CDNs – an even bigger problem than loading latency. Ultimately, it’s a scenario-specific question, at least for a few years still. Personally I’d go local.
Always local no matter the size.
Considering the bandwidth of my cheap shared virtual hosting and that of a CDN, it’s going to be the CDN. Even 20k is a large file by my standards, if I can avoid hosting it, I probably would.
A major CDN will be architecturally superior to my self-hosted solution, and is more likely to be fewer network hops too (if hosting directly at the ISP is still common these days – and I assume most users use their ISP’s DNS servers too).
It’s great to spend the time building your own download containing only what you’re using, as long as you’re prepared to do it again whenever you enhance your site and discover you need to include more features.
There are lots of reasons not to use a CDN and the size is just one consideration. Who wants to update their portfolio of websites when the CDN decides to house-keep away some old versions, or retire the service entirely?
I’d normally go the CDN route to save me bandwidth and to leverage caching. However, in my experience the promise of increased speed is far from true if the resource is not cached.
I very often do HTTP measuring and tuning and have noticed that some CDNs really have a wide variance in response times. Test it once and get it in a few ms. Test it again and see a spike of 600ms. It’s not a constant you can count on. It also varies greatly per location.
Let’s say it’s a pretty light personal site – in that case, staying below Google’s thresholds etc., pulling 200k (adding a few ms) from a CDN with uncertain response latencies would not really be a problem.
But if you’re more content heavy, having a few extra visitors a month, trimming bytes and combining (maybe setting up your own CDN) would really be an issue worth a lot of effort.
I (almost) always start developing with as many CDNs as possible and local fallbacks (constantly measuring). And in the end, if the need arises, I trim and combine accordingly.
So the answer to this poll, more or less, depends on what kind of site it is and how much you need to save to get below that site’s individual thresholds of load time, number of requests, etc.
I prefer local.
E.g. I do not include Ext JS just for basic DOM actions. I mean, why should I load bytes I never use?!
Resources will be cached, but why is that necessary?
Also, connection speed is still limited in some countries…
I would go for the CDN for the simple reason of bandwidth. If you get a massive amount of traffic, every saved kb matters, and the fact that it is very likely to be cached along side the fact that it’s Google serving the script, means that there won’t be any noticeable load time increase.
But, as others have said, it really depends on the site and your infrastructure.
I would say… host it yourself. 20k is nothing compared to today’s internet speeds. Moreover, people with slower internet connections mostly have old browsers, so using a CDN would just mean getting the large file, and you can’t always rely on the cache.
However, if you do have to serve the large file, let Google do the work for you.
Browser cache is always better. The purpose of a website is to attract both new and returning visitors.
And Google will check your speed without taking into account a large (and common) file on a CDN…
I like to keep everything local. I just like being in control of what goes on with my stuff.
I’d go for a local small file.
This isn’t an argument about HTTP Request vs Browser Cache – it’s about small file on slower network vs large file on faster network.
I’d never assume any user of my site has something cached, if they see the page loading slowly they are not likely to care whether it’s on a CDN or local before deciding to go to my competitors.
I will load both, just in case :P
I always go the local file route. There is no way my users should have to load a 200k script when I have a 20k version. You can’t assume that it is already cached. Users on slower internet connections (yes, dial-up still exists for a large number of people) can’t wait for a 200k script to download.
Plus, it’s more reliable to host it yourself. Not to mention that it’s a huge security flaw to run scripts from other sites; that just blows the door wide open for your site to be exploited. Even if the source is trusted it’s a bad idea, because they themselves could be compromised.
Okay, so I am so lazy that I choose CDN: Here’s why:
1. To go local, I have to download jQuery, load it on my server, and insert code for local referencing. Yuk!
2. To use CDN, insert the google code and done. To hell with the user, that saved me like 30 seconds and THAT’S all that counts.
This, in the end, is an “apples vs. oranges” kind of fight. Context is king here. Is your site a massive, well-known portal to the internet? Is it a small fish in a big pond? Does your content span a wide swath of user space, or is your content more of a niche domain? Is this a corporate intranet?
The answer to the question has less to do with how you serve your data and more to do with who you are serving your data to.
I was going to just say CDN, but then I saw the comment for using a local file as a fallback. For poll sake I’ll vote CDN, but I agree that the fallback option is an important piece here.
Great discussion, I’m with Jeremy on this one. Thanks to @Da_n for pointing this out and to @Paul Irish for his answer C. Once again, Chris hit the nail on the head!
I choose the small file self-hosted; it will be combined with my other scripts, and some 20k more or less doesn’t matter.
20kb local file. Have we forgotten about mobile users? Are they likely to have a 200kb JavaScript file cached? How many people are using the CDN? Do we want to punish users who don’t have that file cached & potentially lose them? There are too many unknowns with the CDN option.
I would (& do) use Google’s CDN for jQuery, because I believe the advantages (caching) outweigh the negatives (extra HTTP request if you’ve got multiple JS files, as you can’t concat jQuery) in that case (where the file size is equal)
That’s not to mention the added time for the client spent parsing & executing that MASSIVE JS file. It’s a no-brainer.
I’d hope that any responsible front-end developer would not be using jQuery UI on a website anyway. It’s a web application JavaScript framework (for the lazy)
You have good and important points to bring up in your comments, and then you just wreck it by saying trollish shit like that. Why? It’s OK that you don’t like jQuery UI, it’s not OK to call other coders irresponsible and not back it up.
I didn’t think it would be relevant to the article to elaborate. But since you asked…
jQuery UI is simply too heavy to be appropriate for most websites when a much more appropriate solution would be to write your own plugins. Whilst the amount of options available in each module of jQuery UI is impressive, this means a lot of cruft that you are not using, which means extra file size & extra JS parsing.
A responsible front-end developer would take these things into consideration when architecting a website’s JavaScript, and I would hope that they come to the solution that is best for the user (keeping the JS lean) rather than the solution that is best for them (using a pre-baked solution that can be made to do mostly what they want with some built-in options).
Totally agree with Jayphen.
Cool, that’s a fair opinion. I still disagree that jQuery UI is too heavy or bloated. They provide custom downloads for just the modules you need, and I don’t feel individual modules are bloated. They are function-rich, which can be hugely time-saving when it’s all of a sudden determined you need a callback somewhere funky and they already have that covered.
I’ve worked for places where writing your own plugins from scratch would be far more irresponsible than using something that already exists. Totally depends on the situation.
Excuse me if that was curt before. But see these folks? http://jqueryui.com/about I’ve met and worked with a bunch of them and they are reallllly smart and work on all kinds of projects and use jQuery UI.
Chris I offer tributes to you and each jQueryUI DEV! You guys are great!!!
The point is… a PRO “will” (we say he should) write his own custom code without a single extra line, so obviously it will be loaded locally. That’s how I read the word “responsible” from Jayphen. More: sometimes we are lazy!!! LOL
Of course! A lot of “pre-baked” solutions are really good – like jQuery UI. I would not say the same for jQuery Tools (IMO it WAS good; Tero did a great job and maybe Alibby will help restore the code – I wish the best for them).
Back to your poll… the ones able to write custom code will load locally. The same ones CAN use a (pre-baked) jQuery plugin, and IMO they will load it locally just the same.
My English isn’t THAT good but I think you got what I mean. Did you? :)
The 20k file for JavaScript. Every little bit of performance helps, and a 20k file will run and initialize faster than a 200k file. That tips any download-speed scale I’d be using to weigh this decision.
20k local: nearly 99.99% of users have to download it from your own host.
200k CDN: big chance (80% or greater) it is already cached on the user’s computer. Otherwise, it’s also not a big deal to download from the Google CDN compared with your own host.
On probability, the winner is the 200k CDN.
Making up stats to ‘prove’ a point: 55% likely to signify a lack of any actual research
90% of statistics are made up on the spot. The other 10% are just interpreted incorrectly.
I would make sure to have far-future expiration headers set and a pre-compressed gzip version hosted on my own server, under a cookie-free subdomain specifically for static content. I used wget on my Dreamhost shared server once and it was getting speeds of 8-9MB/s (so I’m guessing a 100Mbps link, unless the other service was limiting it). Of course that’s download and not upload, but upload easily reaches 2MB/s, which is probably about as fast as a file that small is going to get before completing.
Now, if we take it that Google CDN speed is comparable to our own server’s – and bandwidth is normally pretty fat nowadays on hosting accounts – we just need to think about whether the file is already cached.
Google CDN: 0.8s loading time on 2Mbps connection
Self-hosted: 0.08s loading time on 2Mbps connection
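The arithmetic behind those two figures, for anyone who wants to plug in their own numbers – note this is idealized throughput only, ignoring latency, TCP slow start and request overhead, so real times will be higher:

```javascript
// Idealized transfer time = size / bandwidth.
// 2 Mbps is 250,000 bytes/sec, so 200 KB takes 0.8s and 20 KB takes 0.08s.
function transferSeconds(sizeKB, linkMbps) {
  var bytesPerSec = (linkMbps * 1e6) / 8;
  return (sizeKB * 1000) / bytesPerSec;
}
```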
Of course you also need to use a specific version to get Google’s caching benefits. Configuring to use the latest will drop your cache time down to about 1hr. I think using a major version is about 2 days of caching. You need a specific release to take full advantage of it.
In Chromium’s cache of 8k items, which was cleared a couple days ago, I have:
1.7 (no cache benefits)
1.7.2 (old, but cached)
I have about 6-8 different cached versions of jQuery on the other hand.
Even in Firefox, with 56k cached items (984MB), I have:
1.8.2
1.8.3
1.8.6
None of those line up with the version mentioned above. The Google CDN might be a good bet if you’re using a very major release or one that’s been around for a while, but in this case it would fail. I sometimes use jQuery 1.2.6 when making webapps if I don’t require any new features, since the old version is a lot smaller in size and already appears in my cache.
This doesn’t account for large corporations with caching servers set up (a Squid proxy with 10-100GB of space to cache assets) that serve a whole building of people. In that case the chance of the Google CDN version being ready to go is significantly higher. Overall I’d go for the smaller file with proper caching and compression set.
I second the 20k file hosted by the free CDN, with bulletproof fallback of course. Couldn’t you just use the CDN’ed base jQuery lib, forgo all the unnecessary bloat that you may or may not need, and kill two stones with one throw, or roll the stone downhill and collect no moss, whichever pigeon you like to pick from a bush (whoa, sorry about all the clichés)?
I will choose the small file hosted on my own server, concatenated with my other JS into one file, then minified.
I'm a huge fan of using a CDN, but I love my local fallback, which is one of the reasons I love HTML5 Boilerplate. Maybe it's just me, but on a small scale comparing the two seems like splitting hairs. As far as jQuery UI goes, if you're using a non-custom version then CDN all the way. But at the end of the day I'd suggest using whatever works best for the project.
At the end of the day less OCD and get more work done :P
I was hosting the file on my server, but this poll has changed my mind to go with CDN.
Posts like this make me worry about the kind of misinformation spread by these kinds of polls…
If the majority of other developers do the wrong thing, that doesn’t make it right.
Two people can look at exactly the same thing and see two totally different things, never mind looking at the other side of the coin. And yes, don't worry.
My Vote: 20k Local.
First of all, the exact script that you're loading from Google's CDN may not be cached. Is everyone loading the latest jQuery file from the CDN? No; 1.4.2 actually seems to be a pretty popular version. I'd also bet that not many sites use jQuery UI, so the whole caching argument seems hard to back up.
Second, unless you completely trust that the file will already be cached (I don't), I'm pretty sure it's more efficient to load a much smaller file. I really don't care how fast Google's CDN is; you'll still be loading other scripts from your own site anyway, so loading a 20k script shouldn't take long (especially as modern browsers are very good at parallel loading).
Instead of relying on a CDN, I personally think you should just optimize your website in general. Use subdomains, or a different domain altogether, to load static files from. And if your website is big, you should probably be using your own CDN anyway.
I'll go with CDN.
I’ve just got to add here: No one, absolutely no one here brought up the issue of privacy at all in terms of hosting on a CDN, specifically on Google’s CDN. Referrer strings give browsing patterns of every visitor using the CDN to Google. Add that to all of the other Google-hosted services and from what I can tell, I can barely visit any site without them knowing about it.
I am somewhat surprised by how easily many developers simply hand over their traffic data without blinking an eye.
Free isn’t always free.
20k local, always.
+1
+1 me too.
Very well said. That’s one of the reasons I say “local” no matter the size.
100% agreed. It really is a problem that so many people, and especially web developers, hand their users' (and sometimes customers') data to Facebook, Google, etc.
Unfortunately, you break a lot of sites today just by blocking these domains locally, because unobtrusive JavaScript seems to be very unpopular.
You should charge Google for using their CDN; the data you give them is worth more than those 200KB.
whatever size, local. always.
I'd probably go with the CDN first, then after a while I would invest in a decent self-hosted setup using a non-trimmed-down version of jQuery or other software.
I'd go with the CDN as well, since if your site is down and someone is viewing it through Google's cached version, they'll get a functional page instead of a stripped plain-HTML view. That's why I'd also put all CSS files and images on the Amazon CDN.
Downside with a CDN: if you're using HTTPS, you'll get a security warning for partially insecure content.
Most CDNs, like Google's, support both HTTP and HTTPS. Write your script tag like this and the browser will retrieve the correct one depending on which protocol the current page is on:
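A protocol-relative URL does the job (version number illustrative):

```html
<script src="//ajax.googleapis.com/ajax/libs/jqueryui/1.8.14/jquery-ui.min.js"></script>
```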
Limit requests to the server by minifying and combining your JS into one file. Your user is going to need all of your JavaScript anyway, so grab the condensed jQuery UI library and combine it into one file with your site's JS, which limits the latency of multiple requests.
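A minimal build step along those lines (the file contents here are stand-ins for the real scripts; use whatever minifier you prefer):

```shell
# Illustrative stand-ins for the real script files
printf 'ui();\n'   > jquery-ui.custom.min.js
printf 'site();\n' > site.js

# Concatenate the trimmed jQuery UI build with the site's own scripts,
# so the page makes one request instead of several
cat jquery-ui.custom.min.js site.js > all.js
# Then run all.js through a minifier such as UglifyJS or Closure Compiler.
```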
Most people who choose the CDN are arguing that cache is faster, which is perfectly true. However, the reality is that the file won’t be in the user’s cache until it has been downloaded at least once. Of course the same applies to the 20K file. It should be clear therefore that the local file will in fact be faster unless your web server is so slow that it would be quite useless anyway.
An additional reason to host the file locally is simply one of control. i.e. YOU decide what is in that file, not someone else.
Like Paul Irish says, it's best to test, but I can't see the number of users who will have jQuery UI pre-cached being anywhere near big enough to justify an extra 180KB download for everyone else, which then has to be parsed and executed by the browser. Also, client caching is far from guaranteed, particularly on mobile devices.
Serve the 20KB that is required, minified and gzipped, and make sure it will be client cached where possible.
The discussion on whether libraries should be bundled into browsers is interesting. To work well, I guess it would need a new JavaScript API you could query to see whether the library is included, and then run it if it is:
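Something along these lines, say (pseudocode; `navigator.libraries` is a made-up name, nothing like it exists today):

```js
// Hypothetical API, sketched only
if (navigator.libraries && navigator.libraries.has('jquery', '1.6.1')) {
  navigator.libraries.load('jquery', '1.6.1'); // run the browser's bundled copy
} else {
  document.write('<script src="/js/jquery-1.6.1.min.js"><\/script>');
}
```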
or maybe a new attribute for the script tag:
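Perhaps something like this, where `library` names a build the browser may ship (again purely hypothetical):

```html
<!-- if the browser bundles jquery-1.6.1 it runs that copy;
     otherwise it fetches src as usual -->
<script library="jquery-1.6.1" src="/js/jquery-1.6.1.min.js"></script>
```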
Food for thought.
Between 40% – 60% of Tesco customers arrive with an empty cache.
10% of those are always empty.
It's a no-brainer: 20KB local every time on a high-performance website.
I will choose local, not only for the size difference but because a CDN introduces an external dependency. I know CDNs aren't supposed to take the file down, but I really like having everything bundled together in the same folder.
I used to experiment with CDNs until a customer couldn't make the app work because their internal security policy blocked every Google subdomain…
Conventional wisdom says go with the hosted solution, but I'm having trouble with the 200+k file size.
I try to keep my web pages below 100k, so is 20k really that big?
I would use a local file and find other ways to speed up the download (maybe host it on a different domain or use a better host).
Small file, Amazon CDN!
I know the main argument here is file size, but there's more to performance than that. What about parsing? We're talking about having 10x more code on your site than you need, all of which the browser has to parse. So I'd go with the 20k file: yes, the user may need to download it the first time they visit, but it will be cached from then on, and it's a lot less code for the browser to parse and execute.
After reading this discussion, my practice of using local copy has been justified.
Local file, small size. A 20k load, even if not parallel, will on average be 10x faster than the 200k load, before you even count the lag that occurs even on the largest of CDNs.
I'm not going to load a blocking JS file that's ten times bigger than it has to be just because it MIGHT be cached thanks to a Google CDN URL. I'd prefer the on-average faster load of the smaller blocking file.
In addition, if you’re loading something that houses any kind of functionality like that into a secure page, you’re opening yourself up for a world of hurt when it comes to cross domain security issues.
I'd prefer to use
I use both; being a newbie, I want to make sure it gets loaded if the CDN server is down for some reason. Better safe than sorry.
Is this not a good practice?
Yes it is, but make sure not to load both of them:
Da_n said:
!window.jQuery && document.write(unescape('%3Cscript src="/assets/js/libs/jquery-1.5.1.min.js"%3E%3C/script%3E'))
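Spelled out, the full pattern (paths and version as in Da_n's snippet) is:

```html
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.min.js"></script>
<script>
  // If the CDN load failed, window.jQuery is undefined: write in the local copy
  window.jQuery || document.write('<script src="/assets/js/libs/jquery-1.5.1.min.js"><\/script>');
</script>
```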
I'd prefer the 20k file, but not because of download time. Correct me if I'm wrong:
The overhead of parsing and executing 300KB of JS on every page view is far worse than parsing and executing 20KB (I think this is how it works).
Try loading a full-fledged jQuery, jQuery UI, and other heavy JS libraries in a page. The browser will cache them, but I'm quite sure the page will be faster if you load less JS, or none at all.
I would always favor the fastest JS so that the page keeps responsive.
I actually use both:
I have a minifier that grabs the JS libs from the Google CDN and compresses them along with my own scripts into a single file, which is cached on my server. That way I just have to deliver a single (compressed) JS file, while getting the benefit of easier version management (I just update the version number in my view instead of downloading and replacing the files).
I prefer CDN too.