You’ve seen this before:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
This is a way you can load a JavaScript library like jQuery directly from Google’s CDN (Content Delivery Network). You can get quick copy/paste access to these from ScriptSrc.net.
See how the URL above points to 1.4.4 specifically? That little part of the URL can be tweaked. Perhaps you’ve seen it used that way before.
| URL | Behavior |
|---|---|
| `/1.4.4/` | Loads this very specific version of jQuery, which will never change. |
| `/1.4/` | Today, this loads 1.4.4. If and when jQuery goes to 1.4.5, this URL will point to that. If and when jQuery goes to 1.5, it will keep linking to the last release of 1.4. |
| `/1/` | Today, this loads 1.4.4. If and when jQuery goes to 1.5, this URL will point to that. If and when jQuery goes to 2.0, it will keep linking to the last release of jQuery 1.x. |
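With the 1.4.4 family, the three forms look like this:

```html
<!-- exact version: never changes -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>

<!-- latest 1.4.x release -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>

<!-- latest 1.x release -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
```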
As far as I know, there is no super reliable way to link to the “absolute latest” build of jQuery (one that would still work if jQuery went to 2.0, and that included nightly builds). Let’s figure out which one we should use.
A Little Reminder Why We Do This At All
The reasons for doing this are best put by Dave Ward:
- Decreased Latency – the file is served from a server that is literally geographically closer.
- Increased Parallelism – browsers limit how many resources they download simultaneously from a single domain, some to as few as two. Since this is google.com, not yoursite.com, it doesn’t count toward that limit.
- Better Caching – there is a pretty decent chance your visitors already have a copy of this file in their browser’s cache, which is the fastest possible way to load it.
To which I would add:
- Saves Bandwidth – the minified version of jQuery 1.4.4 is 82 KB. So if you had 1,000,000 pageviews with no local caching, that’s about 78 GB of data transfer, which is not insignificant.
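To sanity-check that last figure, a quick back-of-the-envelope calculation (the numbers come straight from the bullet above):

```javascript
// 82 KB per uncached copy of jquery.min.js, one million pageviews
var kbPerFile = 82;
var pageviews = 1000000;

// total transfer in GB, treating 1 GB as 1024 * 1024 KB
var gigabytes = (kbPerFile * pageviews) / (1024 * 1024);
// gigabytes is roughly 78.2
```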
Caching Headers
The first two benefits above (latency and parallelism) help no matter what, but caching needs a bit more discussion. It turns out the naming convention you use matters a great deal for caching. Paul Irish has some basic research on this; I re-did that research and the results are below. Turns out, only linking to an exact version gets proper caching headers.
| URL | Cache duration | Cache-Control header |
|---|---|---|
| `/1.4.4/` | One year | `public, max-age=31536000` |
| `/1.4/` | One hour, strictly | `public, must-revalidate, proxy-revalidate, max-age=3600` |
| `/1/` | One hour, strictly | `public, must-revalidate, proxy-revalidate, max-age=3600` |
A one-hour cache in this context is close to useless. It does make sense, though: if 1.4.5 came out, anyone using a /1.4/ link with a one-year expires header would still be getting 1.4.4, and that’s no good.
Latency, parallelism, and bandwidth are still significant, but they pale in comparison to caching. So if caching is important to you, linking to an exact version is your only option.
Which to Choose
| URL | Trade-offs |
|---|---|
| `/1.4.4/` | Will never change, so will never break code. Best caching. Clearest to understand. |
| `/1.4/` | Possible but unlikely to break code with future updates (sub-point releases are more bug-fixy than API-changy). Fairly useless level of caching. |
| `/1/` | More likely to break code with future updates (a point release might change things). Fairly useless level of caching. |
I would say your best bet is to use the exact version links in almost all scenarios. The /1.4/-style links are pretty useless. The /1/-style links I could see being useful for personal demos, where you kind of want your own demo to break so you know when to go fix it.
Combining Scripts
If you are able to squish your JavaScript files down into one file and serve them like that (like perhaps this or this), you may be better off doing that, even if they are coming from your own server or CDN. One request is almost always better than multiple.
Not Just jQuery
This same naming convention / structure is used for all the JavaScript libraries on Google’s CDN. I tested MooTools and all the exact same stuff applies.
Other CDNs
There are other CDNs that host jQuery as well: Microsoft’s and jQuery.com itself. Neither of these uses the same kind of version-based naming convention, so that part doesn’t apply. However, do note that a direct link on Microsoft’s CDN does get the nice one-year cache.
<script src="http://ajax.microsoft.com/ajax/jquery/jquery-1.4.4.min.js"></script>
jQuery.com’s version doesn’t seem to send any caching information in the header at all.
<script src="http://code.jquery.com/jquery-1.4.4.min.js"></script>
The Microsoft CDN domain has recently been changed from “ajax.microsoft.com” to “ajax.aspnetcdn.com”, to make it a totally cookie-less domain. See http://www.asp.net/ajaxlibrary/cdn.ashx
Nice one, thanks for the tip!
First off, nice post.
Just a note on the “Combining Scripts” portion:
While you are quite right about multiple requests being bad, IMO it cannot hurt to use the caching that Google’s and other CDNs provide for things the size of jQuery (which I believe is around 30k) on higher-traffic sites. I tend to lean toward using Google’s CDN to pull down jQuery and then combining all of my site-specific scripts into one file. I am interested in everyone’s thoughts on the way I do this, though.
I used to have PHP serve a custom-built JavaScript file per page, but it’s a lot of extra programming for not much benefit. Caching is more valuable. I sometimes still include a local fallback for the CDN (see below), but only when jQuery is actually mission-critical.
Much cleaner is this:
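Something along these lines; the local path here is just an example, use whatever your site actually serves:

```html
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
<script>
  // if the CDN copy failed to load, fall back to a local copy
  window.jQuery || document.write('<script src="/js/jquery-1.4.4.min.js"><\/script>');
</script>
```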
I’ve heard several times “Document.write must die!”
So… I tend to avoid it when possible. Check this out for loading jQuery when not yet loaded: http://www.learningjquery.com/2009/04/better-stronger-safer-jquerify-bookmarklet/ (yes, it’s a bookmarklet, but the code can be adapted)
I have some offending code I’m going to look into fixing right now.
@Xaver: is it faster / more reliable? It does look cleaner.
@Chris: I know, but it seems to work fine in some situations. I’ve never read anything that convinced me it’s completely wrong and evil under all circumstances.
Anyway, I’m not a JavaScript whiz yet (I’m a PHP guy) but I’m learning. I pretty much copied that from a forum comment by John Resig.
I just improved the snippet:
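Roughly this, with the check doing the work (the version number is just an example):

```html
<script>
  // load jQuery from the CDN only if something else hasn't already loaded it
  if (typeof jQuery === "undefined") {
    document.write('<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"><\/script>');
  }
</script>
```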
This is in use on the examples on this very site. The “view fancy source” thing is jQuery-powered, but not all the examples use jQuery. So in the footer include (invisible, just analytics basically) I also handle the view-source thing, and it uses this to determine whether it needs to load jQuery or not.
The document.write solution is used in the HTML5Boilerplate: https://github.com/paulirish/html5-boilerplate/blob/master/index.html#L64
I think it’s a good solution in this case, but in general I agree that document.write should be avoided.
“As far as I know there is no super reliable way to link to the “absolute latest” build of jQuery”
I use http://code.jquery.com/jquery-latest.js to link to the newest version of jQuery. This may not be reliable; everything I’ve learned about jQuery was from you, so who knows where I picked that one up :-)
Ah, there you go. Interestingly enough, http://code.jquery.com/jquery-nightly.js is outdated; the actual nightly, I think, is 1.4.5pre (at the time of this writing, of course).
Chris,
You can get the latest jQuery code from Git with http://code.jquery.com/jquery-git.js
This was announced here http://blog.jquery.com/2010/10/24/community-updates-2610/
Ralph
IMO, it isn’t always best to combine all your scripts into one file. Let’s say in a simple case you have something like this going on:
jquery-1.4.4.min.js (obviously the core jquery library)
utils.js (let’s say this is all your site-specific crappy code)
lightbox.js (to show stupid photos)
If you can combine those in a build script, that’s cool in theory, but any change to any one file invalidates the combined file, and you’ll need to push a new one out. And don’t forget, you’ll need a cache-busting mechanism for this file, like a date stamp added to the file name: combinedJS_20101126.js.
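That date-stamp scheme is easy to script; as a sketch (the function and file names here are just illustrative):

```javascript
// Build a cache-busting filename like "combinedJS_20101126.js" so that
// each new combined build gets a URL the browser has never cached.
function cacheBustedName(base, date) {
  var y = date.getFullYear();
  var m = ("0" + (date.getMonth() + 1)).slice(-2); // zero-padded month
  var d = ("0" + date.getDate()).slice(-2);        // zero-padded day
  return base + "_" + y + m + d + ".js";
}

// cacheBustedName("combinedJS", new Date(2010, 10, 26)) gives "combinedJS_20101126.js"
```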
Is the code in the version of jQuery you’re using going to change? No. Is the lightbox.js going to change? Odds are no (you can modify it in dev, but in production it’s probably not going to change).
What will change? Any site-specific js, i.e., utils.js.
Also, files don’t stay in your cache for as long as you think. Browser caches aren’t really keeping up with the times; I think most still ship with a 50 MB default, though IE9 will be bumping theirs up, from what I’ve read. You can easily fill that up browsing around for an hour. Let’s face it, a lot of home pages push nearly a megabyte at you (if not more) nowadays.
HTML5’s cache manifest will be interesting, but also sure to be abused.
Should it be http://ajax.googleapis.com/… or https://ajax.googleapis.com/…?
To avoid that problem:
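The usual fix is a protocol-relative URL, which inherits whatever scheme (http or https) the page itself was loaded with:

```html
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
```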
Thank you for the script links. I always used to have a hard time finding the CDN links for these scripts. :)
To back up Joe’s comment, there’s also the minified version:
http://code.jquery.com/jquery-latest.min.js
Very nice post chris! Appreciate the work you put in.
“Decreased Latency – file is server from a literally-geographically-closer server.”
I would like to point out that this is a simplification of the truth. Quite often the Google server may not be nearer at all, and even if it is, occasionally it can still take several hundred ms of latency to retrieve the file. This all depends on various parameters, as well as sheer luck. Therefore, I find improved caching odds a much bigger advantage than actual latency improvements.
In the older Paul Irish Boilerplate he uses one form of the fallback snippet, and in the latest one he changed it to use “%3E”.
I want to know if there is any difference or some issue with the first one.
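The newer form is something like this; the URL-encoding matters because a literal closing script tag inside the string would end the surrounding script block early:

```html
<script>
  // %3C and %3E are "<" and ">"; unescape() turns them back into a real tag
  // without putting a literal "</script>" inside this script block
  window.jQuery || document.write(unescape("%3Cscript src='js/libs/jquery-1.4.4.min.js'%3E%3C/script%3E"));
</script>
```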
This is a great post for those of us not using a CDN yet, very informative.
Using the Google CDN: can we do this on sites that have secure information? What if Google’s site gets hacked (although the possibility is very minor) and somebody puts a malicious script in jquery.min.js? Wouldn’t that affect all sites that use it?
sure!
Thank you for the Scripts!
Great tut and really good looking as always!