
New Poll: Large file on CDN or small file local?

Published by Chris Coyier

The new poll is up (in the sidebar on the actual site, RSS folks) and it reads:

Would you rather host a 200k file on a major CDN or a 20k file self-hosted?

This requires a little explanation. Let's say the file in question is jQuery UI. On your site, you only need it for it's tabs functionality. You could load a copy of it from Google's CDN like this:

<script src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.14/jquery-ui.min.js"></script>

That file is literally 205k, because it's the complete jQuery UI library. But, it's on what is probably the world's biggest/fastest/most-used CDN (Content Delivery Network). Lots of sites use jQuery UI and probably get it from there, so you have a decent chance of that file already being cached when a user gets to your site.

Or, you could download a customized copy of jQuery UI.

<script src="/js/jquery-ui-1.8.14.custom.min.js"></script>

This file could be only 20k, because it is trimmed down to just the stuff necessary to make the tabs functionality work. That file is ten times smaller, but the chances of it being cached for a user coming to your site are nil if they are a first-time visitor.

So all other things being equal, which do you choose?

Comments

  1. I will choose the small file hosted on my own server. The visitor has to load content from our server anyway, so loading a 20K file from the same server on today's internet connections won't make any noticeable difference.

    • Jon
      Permalink to comment#

      You moron. Today’s internet connections have increased in bandwidth but latency hasn’t kept pace at all. The difference between loading an asset from your local cache and making an extra HTTP request is more a matter of latency than bandwidth. Using a CDN makes it much more likely to get a cache hit and not download the file at all.

  2. Enrique
    Permalink to comment#

    I chose CDN.

    200K already in the user's cache beats any HTTP request. I don't care even if the file is just 1 byte.

    • Adding a CDN brings in another DNS lookup. :-(

    • DNS lookups are cached too though, and the chances are ajax.googleapis.com will already be cached.

    • You are using DNS Manual Prefetching, aren’t you? Aren’t you?

    • Jon
      Permalink to comment#

      James, I don’t think you understand. Prefetching is NOT caching.

      I suggest you read a book or 2 before citing buzzwords like some kind of moronic HR Consultant.

    • Jon, I don’t think YOU understand.

      I was referring to the comment ‘Adding a CDN will bring in another DNS lookup time’. If you manually prefetch DNS, the browser resolves the CDN’s hostname as soon as the HTML loads instead of waiting for the script requests at the bottom of the page… Then, when the JavaScript library is finally requested, the DNS lookup is already cached in the browser (a sketch of the markup follows this thread).

      If you honestly think that I thought DNS Manual Prefetching actually cached the JavaScript, maybe you should heed your own advice.

      Nice try though!
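
    For reference, manual DNS prefetching is just a hint in the document head. A minimal sketch, assuming the Google CDN hostname from the post:

    <link rel="dns-prefetch" href="//ajax.googleapis.com">

    With that in place, the hostname is (usually) already resolved by the time the script tag at the bottom of the page asks for the file.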

  3. I choose CDN too, cache wins all the time :D

  4. I'd choose the CDN first, but with a local fallback. Could that maybe be another option in the poll?

  5. Evert
    Permalink to comment#

    All sites that use certain CDN services load like mud in my country because it seems that certain IPs are blocked (censored). Whenever I get to a site that takes 5 minutes or more to load (without either CSS or JS) I just know it is a CDN problem.
    I don’t think the actual CDN service is blocked, just certain IPs that are associated with it for some reason unknown to me. Basically this is annoying, but it also taught me something. Do I want my visitors (who use an average 2Mbit ADSL connection) to wait 2 seconds more for some file to load, or run the risk that my site breaks for whatever reason because I have no control over the content?
    So I’d rather leave all files on my own server.

    • John Chapman
      Permalink to comment#

      Absolutely correct.
      And I really hate sites that pull from all over the place. I use NoScript. I like it when all the content comes from one location.

    • webnesto
      Permalink to comment#

      This has been my thinking for a while. I like the *idea* of CDN, but I think it’s introducing an unnecessary stability risk. And I know everyone likes to think that Google is impenetrable, but they will probably successfully be hacked one day. When that happens, and someone injects malicious password hacking code into the distributed copy of jQuery, boom goes the dynamite.

  6. Chris
    Permalink to comment#

    CDN for me too. They are generally faster for my purposes and allow always-up-to-date release usage, which is nice when you release software to the public.

    • webnesto
      Permalink to comment#

      Automatically using the most up-to-date version of a library leaves you scrambling to keep up when the API updates. I would much rather test the latest version in a dev/qa/staging environment first, before accepting the development team’s word that it is “backwards compatible”.

  7. Definitely the 20k file.

  8. CDN all day, every day… it saves me bandwidth, and the chances are high that it is already cached anyway.

  9. TiGR
    Permalink to comment#

    Smaller file size means less code for the browser to parse and execute, so I definitely choose the small local file.

    But the size difference really matters. If it is 10 times, as you say, then it’s worth it.

    • I agree. I do not want to load unnecessary libraries, CSS, JS… the traffic is not a problem, but client memory and browser parsing time are important.

      I prefer optimized code instead of “flexible” code that loads a bunch of unused code.

  10. Small local file, so far.

    The reason behind the decision is that the probability of having the CDN file cached is IMHO very low – the default browser cache sizes are REALLY small (50MB in Firefox, automatic in IE8 but hovering around 50–250MB, 20MB in Opera) – and that needs to include all the YouTube SWFs, images, CSS and quite often HTML for all the sites the user visits. So, unless you’re using the same libraries AND the same CDN AND the same library version as some of the really popular sites (e.g. Facebook, NYTimes), the chances are that the file is not cached.

    I was planning to do some research in regards to this question… Firefox has some cache viewer plugins – all it takes is to take a peek (after getting the permission) at the cache of several _normal_ people (your Mom, your accountant, random non-geek guy in Starbucks, etc) and see how often the CDN files are there.

  11. Jake
    Permalink to comment#

    For obvious reasons the CDN is the better route. The user will most likely get an improvement in load time, and it takes the load of serving that file off of your own server. Win-win.

  12. Of course, the research may just prove me wrong :) Checked my Opera cache – which is set to 400MB and has over 20k files (the oldest one was accessed in August, 2010) – I do have all versions of jQuery cached. But then again – the websites I frequent are probably not the same as my audience’s, and I do have the max cache size set explicitly.

  13. paul
    Permalink to comment#

    jQuery should be pre-packaged ( included ) with browsers.

    Just like a side of salad that comes with the meal.

    • Ricardo
      Permalink to comment#

      Wouldn’t that be impossible to keep up to date?
      Every time the framework is updated you would need to update your browser, and I guess we all know how well people update their browsers… No thanks, I think I’d prefer using the cached version on the CDN. As others said, there’s a big chance that people already have a cached version of the framework available to them.

    • paul
      Permalink to comment#

      The versioning is manageable:

      jQuery in browsers should be updated separately from the browser itself, the same way as Adobe Flash – you don’t have to update the browser to get a new version of Flash.

      JavaScript is a client-side thing, right?
      So why not have it bundled with browsers?

    • stryju
      Permalink to comment#

      Keep in mind that it would require ALL websites using jQuery to be constantly updated – some changes between versions are quite big and need small (yet important) code updates.

      Look at jQuery 1.4 (still quite popular) versus jQuery 1.6.

    • Permalink to comment#

      I agree that it should be included by default since it’s open source and extremely popular, and as for having to update your browser all the time, this should be done in the background, as Chrome currently does.

      Remember Google Chrome is already doing this with Flash, including it by default and updating it silently in the background.

    • Ricardo
      Permalink to comment#

      I know Chrome does this very well, but we’re not living in an ideal world. We all know how well updated Internet Explorer is on most users’ PCs (let alone in large companies); a certain client of ours goes even further and blocks a lot of incoming traffic, which makes auto-updates nearly impossible, and I’m quite sure there are more companies who do so.
      In addition, I’m thinking of how that should be implemented: loading it by default could cause other frameworks, or self-written code, to behave differently than the creators intended.

  14. The x-factor is how many repeat visitors I have on my own site. Considering the TCP slow-start algorithm, a 20k file will get downloaded in 3 windows (3 + 5 + 7 segments). That will download in a reasonable time for most users.

    Of course cache is faster, especially if fetched from an SSD, but given that the larger library will parse and execute more slowly and use more memory, I would choose the 20k self-hosted option on any site where roughly 50 percent of visits can be expected to come from repeat visitors.

  15. The general internet connection for home users in Holland has a bandwidth of 8Mbit/sec. This translates to 1MB/sec for the entire network at home. Considering at least someone uses Skype, someone opens YouTube and you’re trying to open a site, you can take for granted that he or she has a mere 200k/sec of bandwidth left. This makes the choice a geographically diverse call. If my client lives in the US, I’d definitely go for the 200k file on the CDN, but since most Dutch sites don’t follow good web standards, chances are pretty high that I’m the first to offer them the CDN link. That gives me a full second of loading-time disadvantage. That’s huge!
    I’m pretty confident as a developer, so a lot of times I’m prone to simply rewriting the jQuery UI libs to integrate “natively” with my other JS, and then I’ll just serve a <20k file with all my JS combined. It really depends on the situation, but that's mostly the way we roll. /me hates the bandwidth problems.

  16. Why stop there? Why not add more unnecessary scripts from the Google CDN since they’ll be cached? Doesn’t YAGNI stand for “You’re always gonna need it”? ;)

    If you want, you could do what Yahoo did years ago and get some real metrics on how many of your visitors browse with a full cache vs. an empty cache experience.

    What Yahoo found was that 40-60% of their own users had an empty cache, and about 20% of all page views were with an empty cache.

    “[The results say] that even if your assets are optimized for maximum caching, there are a significant number of users that will always have an empty cache. This goes back to the earlier point that reducing the number of HTTP requests has the biggest impact on reducing response time.” [emphasis mostly mine]

    Sidebar:
    Personally, I prefer to keep my scripts concise and pared down only to what I really need, and serve a single, combined and minified file. I realize that “tabs” functionality is just an example here, but it confounds me why anyone would use jQueryUI just for tabs when they’re easy enough to implement with jQuery itself, or—heaven forbid—vanilla JavaScript.

  17. Permalink to comment#

    We tried using the Google CDN for a while but in practice it was slower. The “theoretical” benefits never materialized and yet you will see this everywhere on a list of best practices and recommendations. I think it’s just common wisdom. People need to write articles about the Top 10 Things You Can Do Right Now because it’s their job. But they just repurpose ideas from other articles which got advice from other articles etc. Nobody really measures it. We did, and it was slower. We bundle jquery with our other site specific javascript in a single minified package for delivery.

  18. I’d go with the CDN, only because that way, it may already be cached, and even if it isn’t, what will be lost during loading?… Tab functionality. I don’t think that would mean the end of the world.

  19. Kushal
    Permalink to comment#

    I’d choose the CDN, since the web is an open ecosystem and such choices help it remain so. And since Chris has already mentioned the very strong advantage of letting a CDN take care of libraries, why use a local copy?

  20. It’s remarkable how strong the arguments are for a local file against a CDN version, whereas the CDN lovers come up with arguments they most likely take for granted and haven’t measured. Sure, there is a benefit to parallel downloading using multiple domains (an argument I haven’t spotted), but you could easily mimic that using your own, cookie-free subdomain (the cookie-free part is really important, since it keeps cookie headers out of every static request).

  21. I picked 20K local because this seems to work best for small to medium web sites with a local audience. Also, you can merge this 20K with the rest of the JS on your site into a single JS file.

  22. I still have a tendency to use local JS libs, other than pulling jQuery from Google about half the time.

  23. Local. Think about mobile! Paying for 200k is more than paying for 20k. And same with waiting when you’re at some place where there’s no 3G.

  24. Scott
    Permalink to comment#

    I vote 20k local, for reasons already well summarized by Dominykas and Frederico.

    In the real world, if you are using jQuery you are also adding your own code, and possibly some jQuery plugins on top of that. In most cases it’s better to serve one single JavaScript file than to serve jQuery separately and then your own file as well.

  25. Answer C: Implement metrics and A/B test it to figure out which delivers the file quicker.

    The Resource Timing metrics that are being implemented in browsers will help a lot with this sort of thing (a quick sketch follows this thread): http://w3c-test.org/webperf/specs/ResourceTiming/

    • Doing A/B testing would be interesting, but I don’t see any very compelling reason why you wouldn’t just set up AWS or another CDN to serve up the 20k file. It’s very cheap and easy, even if you serve millions of visitors.

      Any downsides to that logic?

    • Permalink to comment#

      Paul’s got it right.

      Eric: For some people, maybe cost? It’d be so tiny, though (at least in Europe and North America), that you could probably find the funds to pay for it by looking down at the sidewalk and finding some change that somebody dropped.

      Or even though it’s relatively easy, the small technical barrier some people might perceive in setting up such a thing?
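
    As a rough illustration of the measuring Paul suggests – a minimal sketch using the (still draft) Resource Timing API, assuming a browser that implements it and using the jQuery UI URL from the post:

    // Look up the timing entry for the CDN copy of jQuery UI (URL from the post)
    var url = "http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.14/jquery-ui.min.js";
    var entries = window.performance.getEntriesByName(url);
    if (entries.length) {
      // duration covers the whole fetch; a near-zero duration usually means a cache hit
      console.log("jQuery UI fetch took " + entries[0].duration + "ms");
    }

    Log that for a sample of real visitors (A bucket on the CDN, B bucket self-hosted) and you have actual numbers instead of guesses.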

  26. Clayton
    Permalink to comment#

    User traffic and user locations are important factors.

    Here’s a discussion I came across on Stackoverflow: http://bit.ly/iXWBqQ

    “When a specific request is made by a user, the server closest to that user (in terms of the minimum number of nodes between the server and the user) is dynamically determined. This optimizes the speed with which the content is delivered to that user.” – from the stack discussion

    I can see how user location can be important in whether or not the CDN provider will actually improve load time; although I’m sure Google is well covered for jQuery needs.

  27. Alistair
    Permalink to comment#

    Taking the example of jQuery UI, the 200k vs 20k thing is fairly irrelevant since realistically in both cases they will be gzipped. After gzipping it is 51k vs 7k.

    7K is not a lot to bundle into your other JS.

  28. hell
    Permalink to comment#

    “Let’s say the file in question is jQuery UI. On your site, you only need it for it’s tabs functionality.”

    its*

  29. Permalink to comment#

    I wonder how many people struggle with this dilemma but then create websites with flash, funky “tinsel” and all sorts of other visual garnish?

    Not saying people should disregard such considerations but there are other ways to save more bandwidth and save a few requests if it’s important.

    I’d just use a CDN because it’ll be up to date, more likely to be cached, and because it’s one less thing I have to upload.

  30. The practical side of me doesn’t like the 180 KB that remain unused. Why give it to the user if he won’t use it? It almost seems… unsemantic. The big factor, I think, is mobile networks – not only is 180 KB a network hog; the cache benefit is gone as well. With the growing use of mobile devices on the web, this is worth considering. Granted, these devices will get faster, but I’m pretty sure 180 KB will still cut into the data quota a bit. Additionally, I agree that not as many users will have the item cached as expected, and the geographical speed gain from the CDN is offset by the file size. Also, as noted, the user is likely to return, resulting in a cached but smaller file. I do concede that if the “small” file adds a large enough timespan to page load, the delay could dissuade potential users. On the other hand, firewalls or other filters often block CDNs – an even bigger problem than loading latency. Ultimately, it’s a scenario-specific question, at least for a few years still. Personally I’d go local.

  31. Always local no matter the size.

  32. Considering the bandwidth of my cheap shared virtual hosting and that of a CDN, it’s going to be the CDN. Even 20k is a large file by my standards, if I can avoid hosting it, I probably would.

    A major CDN will be architecturally superior to my self-hosted solution, and is likely to be fewer network hops away too (if hosting directly at the ISP is still common these days – and I assume most users use their ISP’s DNS servers too).

    It’s great to spend the time building your own download containing only what you’re using, as long as you’re prepared to do it again whenever you enhance your site and discover you need to include more features.

  33. There are lots of reasons not to use a CDN and the size is just one consideration. Who wants to update their portfolio of websites when the CDN decides to house-keep away some old versions, or retire the service entirely?

  34. I’d normally go the CDN route to save bandwidth and to leverage caching. However, in my experience the promise of increased speed is far from true if the resource is not cached.

    I very often do HTTP measuring and tuning and have noticed that some CDNs have a really large variance in response times. Test it once and get it in a few ms. Test it again and see a spike of 600ms. It’s not a constant you can count on. It also varies greatly per location.

  35. Niclas
    Permalink to comment#

    Let’s say it’s a pretty light personal site – in that case, staying below Google’s thresholds etc., pulling 200k (adding a few ms) from a CDN with uncertain response latencies would not really be a problem.
    But if you’re more content-heavy, with a few extra visitors a month, trimming bytes and combining files (maybe setting up your own CDN) would be worth a lot of effort.

    I (almost) always start developing with as many CDNs as possible & local fallbacks (constantly measuring). And in the end, if the need arises, I trim and combine accordingly.

    So the answer to this poll, more or less, depends on what kind of site it is and how much you need to save to get below the site’s individual thresholds for load time, number of requests, etc.

  36. Permalink to comment#

    I prefer local.
    E.g. I do not include Ext JS just for basic DOM actions. I mean, why should I load bytes I never use?!
    Resources will be cached, but why is that necessary?
    Also, connection speed is still limited in some countries…

  37. I would go for the CDN for the simple reason of bandwidth. If you get a massive amount of traffic, every saved kb matters, and the fact that it is very likely to be cached – alongside the fact that it’s Google serving the script – means that there won’t be any noticeable load-time increase.

    But, as others have said, it really depends on the site and your infrastructure.

  38. I would say… host it yourself. 20k is nothing compared to today’s internet speeds. Moreover, people with slower internet connections mostly have old browsers, so using a CDN would just mean downloading the large file – and you can’t always rely on the cache.

  39. Browser cache is always better. The purpose of a website is to attract both new and returning visitors.

    And Google will check your speed without taking into account that a large (and common) file is on a CDN…

  40. Ken
    Permalink to comment#

    I like to keep everything local. I just like being in control of what goes on with my stuff.

  41. Colin Robertson
    Permalink to comment#

    I’d go for a local small file.

    This isn’t an argument about HTTP Request vs Browser Cache – it’s about small file on slower network vs large file on faster network.

    I’d never assume any user of my site has something cached, if they see the page loading slowly they are not likely to care whether it’s on a CDN or local before deciding to go to my competitors.

  42. The Smartest
    Permalink to comment#

    I will load both, just in case :P

  43. I always go the local-file route. There is no way my users should have to load a 200k script when I have a 20k version. You can’t assume that it is already cached. Users on slower internet connections (yes, dial-up still exists for a large number of people) can’t wait for a 200k script to download.

    Plus, it’s more reliable to host it yourself. Not to mention that it’s a huge security flaw to run scripts from other sites. That just blows the door wide open for your site to be exploited. Even if the source is trusted it’s a bad idea, because they themselves could be compromised.

  44. Lazy Garry
    Permalink to comment#

    Okay, so I am so lazy that I choose CDN: Here’s why:

    1. To go local, I have to get jquery, load it on my server, insert code for local referencing. Yuk!

    2. To use CDN, insert the google code and done. To hell with the user, that saved me like 30 seconds and THAT’S all that counts.

  45. Mike Tallent
    Permalink to comment#

    This, in the end, is an “apples vs. oranges” kind of fight. Context is king here. Is your site a massive, well-known portal to the internet, or a small fish in a big pond? Does your content span a wide swath of user space, or is it more of a niche domain? Is this a corporate intranet?

    The answer to the question has less to do with how you serve your data and more to do with who you are serving your data to.

  46. I was going to just say CDN, but then I saw the comment about using a local file as a fallback. For the poll’s sake I’ll vote CDN, but I agree that the fallback option is an important piece here.

    • Permalink to comment#

      Great discussion. I’m with Jeremy on this one, and thanks to @Da_n for pointing this out and to @Paul Irish for his answer C. Once again Chris hit the nail on the head!

  47. I choose the small self-hosted file; it will be combined with my other scripts, and 20k more or less doesn’t matter.

  48. The 20k file for JavaScript. Every little bit of performance helps, and a 20k file will run and initialize faster than a 200k file. That tips any download-speed scale I’d be using to weigh this decision.

  49. allen
    Permalink to comment#

    20k local: nearly 99.99% of users have to download it from your own host.
    200k CDN: big chance (80% or greater) that it is already cached on the user’s computer. Even when it isn’t, downloading from the Google CDN is not a big deal compared with your own host.
    In terms of probability, the winner is the 200k CDN.

  50. I would make sure to have far-future expiration headers set and a pre-gzip compressed version hosted on my own server under a cookie-free sub domain specifically for static content. I used wget on my Dreamhost shared server once and it was getting speeds of 8-9MB/s (so I’m guessing a 100Mbps link unless the other service was limiting it). Of course that’s download and not upload, but upload easily reaches 2MB/s which is probably about as fast as a file that little is going to reach before completing.

    Now that we’ve determined Google CDN caching and speed are about the same as our own server’s, and hosting bandwidth is normally pretty fat nowadays, we just need to think about whether it’s already cached.

    Google CDN: 0.8s loading time on 2Mbps connection
    Self-hosted: 0.08s loading time on 2Mbps connection

    Of course you also need to use a specific version to get Google’s caching benefits. Configuring to use the latest will drop your cache time down to about 1hr. I think using a major version is about 2 days of caching. You need a specific release to take full advantage of it.
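
    To illustrate the version-specific URL point – the exact cache lifetimes are Google’s to change, so treat these as approximate, and the jQuery URLs here are purely illustrative:

    <!-- Fully versioned: served with a far-future Expires header -->
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>

    <!-- Version aliases resolve to "whatever is newest", so they get the much
         shorter cache lifetimes mentioned above -->
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.6/jquery.min.js"></script>
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>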

  51. In Chromium’s cache of 8k items, which was cleared a couple of days ago, I have these jQuery UI versions:
    1.7 (no cache benefits)
    1.7.2 (old, but cached)

    I have about 6–8 different cached versions of jQuery itself, on the other hand.

    Even in Firefox, with 56k cached items (984MB), I have:
    1.8.2
    1.8.3
    1.8.6

    None of those line up with the version mentioned above. Google CDN might be a good bet if you’re using a very major release or one that’s been around for a while, but in this case it would fail. I sometimes use jQuery 1.2.6 when making webapps if I don’t require any new features, since the old version is a lot smaller and already appears in my cache.

    This doesn’t account for large corporations with caching servers set up (a Squid proxy with 10–100GB of space to cache assets) serving a whole building of people. In that case the chance of the Google CDN version being ready to go is significantly higher. Overall I’d go for the smaller file with proper caching and compression set.

  52. I second the 20k file hosted by the free CDN, with a bulletproof fallback of course. Couldn’t you just use the CDN’ed base jQuery lib, forgo all the unnecessary bloat that you may or may not need, and kill two stones with one throw – or roll the stone downhill and collect no moss, whichever pigeon you like to pick from a bush? (Whoa, sorry about all the clichés.)

  53. I will choose the small file hosted on my server, concatenated with my other JS into one file, then minified.

  54. I’m a huge fan of using a CDN, but I love my local fallback – one of the reasons I love HTML5 Boilerplate. Maybe it’s just me, but it seems like on a small scale comparing the two is splitting hairs. Now as far as jQuery UI goes, if you’re using a non-custom version then CDN all the way. But at the end of the day I would suggest using what works best for the project.

    At the end of the day less OCD and get more work done :P

  55. I was hosting the file on my server, but this poll has changed my mind to go with CDN.

    • Posts like this make me worry about the kind of misinformation spread by these kinds of polls…

      If the majority of other developers do the wrong thing, that doesn’t make it right.

    • Two people can look at exactly the same thing and see two totally different things, never mind looking at the other side of the coin. And yes, don’t worry.

  56. My Vote: 20k Local.

    First of all, the exact script that you’re loading from Google’s CDN may not be cached. Is everyone loading the latest jQuery file from the CDN? No – 1.4.2 actually seems to be a pretty popular version. I also bet that not many sites use jQuery UI, so the whole caching argument seems hard to back up.

    Second, unless you completely trust that the file will already be cached (I don’t), I’m pretty sure it is more efficient to load a much smaller file. I really don’t care how fast Google’s CDN is; you will still probably be loading other scripts from your site anyway, so loading a 20k script shouldn’t take too long (especially as modern browsers are very good at parallel loading).

    Instead of relying on a CDN, I personally think you should just optimize your website in general. Use subdomains, or a different domain altogether, to load static files from. And if your website is big, you should probably be using your own CDN anyway.

  57. Permalink to comment#

    will go with CDN.

  58. I’ve just got to add here: No one, absolutely no one here brought up the issue of privacy at all in terms of hosting on a CDN, specifically on Google’s CDN. Referrer strings give browsing patterns of every visitor using the CDN to Google. Add that to all of the other Google-hosted services and from what I can tell, I can barely visit any site without them knowing about it.

    I am somewhat surprised by how easily many developers simply hand over their traffic data without blinking an eye.

    Free isn’t always free.

    20k local, always.

    • peter
      Permalink to comment#

      +1

    • +1 me too.
      Very well said. That’s one of the reasons I say “local” no matter the size.

    • 100% agreed. It really is a problem that so many people, and especially web developers, hand their users’ (and sometimes customers’) data to Facebook, Google, etc.

      Unfortunately, you break a lot of sites today just by blocking these domains locally, because unobtrusive JavaScript seems to be very unpopular.

      You should charge Google for using their CDN. The data you give them is worth more than those 200kb.

      Whatever the size: local, always.

  59. I’d probably go with the CDN first, then after a while I would invest in a decent self-hosted setup using a non-trimmed-down version of jQuery or other software.

  60. I’d go with the CDN as well, since if your site is down and someone is viewing it through Google’s cached version, they’ll get a functional page instead of a stripped plain-HTML view… That’s why I’d also put all CSS files and images on the Amazon CDN…

  61. Downside with a CDN: if you’re using HTTPS you’ll get a security warning for partially insecure content unless the CDN asset is also requested over HTTPS…

  62. Andrew
    Permalink to comment#

    Limit the requests to the server by minifying and combining your JS into one file. Your user is going to need all of your JavaScript anyway, so grab the condensed jQuery UI library and combine it into one file with your site’s JS, which will limit the latency due to multiple requests.
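
    A minimal sketch of that combine step, assuming a Node-based build script and hypothetical file names (a real build would also run the result through a minifier such as YUI Compressor or Closure Compiler):

    // build.js – naive concatenation of the trimmed jQuery UI build with the site's own JS
    var fs = require('fs');

    var sources = [
      'js/jquery-ui-1.8.14.custom.min.js', // the 20k custom build from the post
      'js/site.js'                          // your own code (hypothetical path)
    ];

    var combined = sources.map(function (file) {
      return fs.readFileSync(file, 'utf8');
    }).join(';\n'); // the extra semicolon guards against files that omit a trailing one

    fs.writeFileSync('js/site.combined.js', combined);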

  63. John G
    Permalink to comment#

    Most people who choose the CDN are arguing that cache is faster, which is perfectly true. However, the reality is that the file won’t be in the user’s cache until it has been downloaded at least once. Of course the same applies to the 20K file. It should be clear therefore that the local file will in fact be faster unless your web server is so slow that it would be quite useless anyway.

    An additional reason to host the file locally is simply one of control. i.e. YOU decide what is in that file, not someone else.

  64. David Goss
    Permalink to comment#

    Like Paul Irish says it’s best to test, but I can’t see the number of users that will have jQuery UI pre-cached being anything like big enough to justify an extra 180KB download for all the rest, which then has to be parsed and executed by the browser. Also, client caching is far from guaranteed, particularly on mobile devices.

    Serve the 20KB that is required, minified and gzipped, and make sure it will be client cached where possible.

    The discussion on whether libraries should be included in browsers is interesting. To work well I guess it would need a new JavaScript API which you could query to see if the library is included and then run if it is:

    if (library.has("jQuery 1.6.2")) {
     library.run("jQuery 1.6.2");
    } else {
     // load script from server
    }

    or maybe a new attribute for the script tag:

    <script src="/js/jquery.js" library="jQuery 1.6.2"></script>

    Food for thought.

  65. I will choose local, not only for the size difference, but because a CDN introduces external dependencies. I know CDNs are not supposed to take down the file, but somehow I really like having everything bundled together in the same folder.

  66. Boris Delormas
    Permalink to comment#

    I used to experiment with CDNs until a customer couldn’t make the app work because their internal security policy was blocking every Google subdomain…

  67. Muditha
    Permalink to comment#

    Conventional wisdom says go with the hosted solution but I am having trouble with the 200+k file size.

    I try to keep my web pages below 100k, so is 20k really that big?

    I would use a local file and find other ways to speed up the download (maybe host it on a different domain or use a better host).

  68. Small file, Amazon CDN!

  69. Mike
    Permalink to comment#

    I know the main argument here is with file size, but there is more to performance than just that. What about parsing? Now we’re talking about having 10x more code on your site than you need, which the browser will need to parse…so I’d go with the 20k file. Yes, the user may need to download it the first time they visit your site, but it’ll be cached from then on and will be a lot less code for the browser to have to parse and use from then on.

  70. mrtosiba
    Permalink to comment#

    After reading this discussion, my practice of using a local copy has been justified.

  71. patrick
    Permalink to comment#

    Local file, small size. A 20k load, even if not parallel, will be roughly 10x faster, on average, than the 200k load – ignoring the lag time that occurs even on the largest of CDNs.

    I’m not going to load a blocking JS file that’s ten times bigger than it has to be, even if it MIGHT be cached thanks to a Google CDN URL. I’d prefer the on-average faster load of the smaller blocking file.

    In addition, if you’re loading something that houses any kind of functionality like that into a secure page, you’re opening yourself up to a world of hurt when it comes to cross-domain security issues.

  72. I’ll prefer use

  73. Tony
    Permalink to comment#

    I use both. Being a newbie, I want to make sure it gets loaded if the CDN server is down for some reason – better safe than sorry.
    Is this not a good practice?

    • Jeremy William
      Permalink to comment#

      Yes it is, but make sure not to load both of them:

      Da_n said:

      !window.jQuery && document.write(unescape('%3Cscript src="/assets/js/libs/jquery-1.5.1.min.js"%3E%3C/script%3E'))
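
      For context, that one-liner is the second half of the pattern HTML5 Boilerplate popularized; the whole thing looks roughly like this (version number and local path are just illustrative):

      <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.min.js"></script>
      <script>
        // If the CDN copy failed to load, window.jQuery is undefined,
        // so write in a script tag pointing at the self-hosted copy.
        !window.jQuery && document.write(unescape('%3Cscript src="/assets/js/libs/jquery-1.5.1.min.js"%3E%3C/script%3E'))
      </script>

      That way the local copy is only ever downloaded when the CDN request fails.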

  74. Artscoop
    Permalink to comment#

    I’d prefer the 20k file, but not because of download time. Correct me if I’m wrong:
    the overhead of having to parse and execute 300kb of JS on each page view is far worse than executing 20kb of JS. (I think this is how it works.)
    Try loading full-fledged jQuery, jQuery UI, and other heavy JS libraries on a page. The browser will cache them, but I’m quite sure the page will be faster if you load less JS, or no JS at all.

    I would always favor the fastest JS so that the page keeps responsive.

  75. Bastian Schwarz
    Permalink to comment#

    I actually use both:

    I’ve got a minifier that grabs the JS libs from the Google CDN and compresses them along with my own scripts into a single file, which is cached on my server. That way I just have to deliver a single (and compressed) JS file, while having the benefit of easier version management (I just have to update the version number in my view instead of downloading and replacing the files).

  76. Nick
    Permalink to comment#

    I prefer CDN too.

This comment thread is closed. If you have important information to share, you can always contact me.
