The following is a guest post by Jesse Shawl.
In May of 2012, Chris updated a previous post about dynamic page replacing content. This article is an update to that update, which uses the HTML5 history API for a better user experience.
Here’s a quick recap of the best practices:
- Works fine with JavaScript disabled.
- It is possible to "deep link" to specific content.
- The browser’s back and forward buttons work as expected.
The Problem With URL Hashes
For one individual user, the existing demo meets the criteria just fine, but URLs are permanent addresses, and they’re going to be shared.
Consider the following scenario:
- I’ve got a fancy browser with JavaScript enabled. I’m browsing the demo site, and I find a great product I’d like to share with a friend.
- I copy the URL “http://example.com/#awesome-product”, and send it to my friend.
- My friend doesn’t have JavaScript enabled. She opens the link in her browser, and is confused that the awesome product doesn’t load as expected.
- She gets confused/frustrated and swears never to visit example.com again.
THIS IS BAD UX!
Today, we’ll be improving the existing demo such that the dynamic page replacing content doesn’t rely on the hash.
Modernizr for Progressive Enhancement
Note: The following examples build upon the previous demo. Download the files here to follow along.
If you’re not using Modernizr yet, go get it (I’ll wait). It’s the easiest way to detect browser features with JavaScript.
Since we’ll be playing with the HTML5 history API, we only need to check the “History” checkbox. Download the custom build here.
Include it in the <head> of our HTML file:
<script src='js/modernizr.js'></script>
Testing for HTML5 history support is super easy:
// dynamicpage.js
$(function() {
  if (Modernizr.history) {
    // history is supported; do magical things
  } else {
    // history is not supported; nothing fancy here
  }
});
First, we’re going to set up everything to manipulate the browser’s history, and then we’ll add all the fancy loading provided from the previous demo.
Manipulate the History with HTML5 History API
The HTML5 history.pushState() method allows us to:
- Change the URL
- without a hash
- without a page refresh (this is where the dynamic page replacing content happens)
- Update the browser's history stack
- so we can navigate through the history with back and forward button clicks.
The pushState() method takes three parameters:
history.pushState(stateObject, "title", URL);
We’re only going to be supplying the URL in this example, but you can learn more about the history API over at the Mozilla Developer Network.
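Although the demo only supplies the URL, the state object is worth knowing about: whatever you hand to pushState() comes back to you on the matching popstate event. Here’s a minimal sketch (the function names are mine, not part of the demo; loadContent() is the function the article builds next):

```javascript
// Sketch only: store the content URL in the state object so the popstate
// handler can restore the right view without re-parsing the location.
function rememberAndGo(href) {
  var state = { contentUrl: href };
  // Second argument is a title; most browsers currently ignore it.
  history.pushState(state, "", href);
  return state;
}

function onPopState(event) {
  // event.state is whatever was passed to pushState (it can be null,
  // e.g. for the entry created by the initial page load).
  if (event.state && event.state.contentUrl) {
    loadContent(event.state.contentUrl);
  }
}
```

You’d wire onPopState up with `$(window).bind("popstate", ...)` exactly as shown later in the article.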
After changing the URL, we’ll want to set up a function to load the content – loadContent() seems like a good name.
$(function() {
  if (Modernizr.history) {
    // history is supported; do magical things
    // hijack the nav click event
    $("nav").delegate("a", "click", function() {
      var _href = $(this).attr("href");
      // change the url without a page refresh and add a history entry.
      history.pushState(null, null, _href);
      // load the content
      loadContent(_href); // fear not! we're going to build this function in the next code block
      // prevent the default full-page navigation
      return false;
    });
  } else {
    // history is not supported; nothing fancy here
  }
});
And now, we just need to code up the loadContent() function, which is a matter of taking code from the original example.
Code dump:
// set up some variables
var $mainContent = $("#main-content"),
    $pageWrap = $("#page-wrap"),
    baseHeight = 0,
    $el;

// calculate wrapper heights to prevent jumping when loading new content
$pageWrap.height($pageWrap.height());
baseHeight = $pageWrap.height() - $mainContent.height();

function loadContent(href) {
  $mainContent
    .find("#guts")
    .fadeOut(200, function() { // fade out the content of the current page
      $mainContent
        .hide()
        .load(href + " #guts", function() { // load the contents of whatever href is
          $mainContent.fadeIn(200, function() {
            $pageWrap.animate({
              height: baseHeight + $mainContent.height() + "px"
            });
          });
          $("nav a").removeClass("current");
          $("nav a[href$='" + href + "']").addClass("current");
        });
    });
}
Handle browser back and forward button clicks
At this point, content is loaded in a fancy ajaxy way, but clicking the back button won’t take us back… yet.
The history API gives us access to the popstate event, which fires every time the history stack changes (read: the back and/or forward buttons are clicked). Any time this event fires, we just need to call our loadContent() function:
$(window).bind("popstate", function() {
  var link = location.pathname.replace(/^.*[\\/]/, ""); // get filename only
  loadContent(link);
});
A Little Homework Assignment
At the time of this writing, the popstate event fires on page load in Chrome. This means two requests are being made:
- The original HTTP request for whateverpage.html
- The request made by $.load in our loadContent() function
There are a couple of different ways to handle this, but I’ll let you decide which works best.
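One common approach (a sketch of mine, not the article’s official answer – the function name is hypothetical): remember the URL the page loaded with, and swallow the first popstate if nothing has actually changed.

```javascript
// Sketch: ignore the popstate Chrome fires on the initial page load.
var initialPop = true;
var initialUrl = (typeof location !== "undefined") ? location.href : null;

function handlePopState() {
  var firstPop = initialPop;
  initialPop = false;
  // If this is the very first popstate and the URL hasn't changed since
  // load, there is nothing to re-fetch: bail out.
  if (firstPop && location.href === initialUrl) {
    return;
  }
  // Otherwise behave exactly like the article's handler.
  loadContent(location.pathname.replace(/^.*[\\/]/, "")); // filename only
}
```

Bind handlePopState in place of the anonymous function in the snippet above, and the duplicate request on load goes away while real back/forward clicks still work.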
If you’re still supporting users that have JavaScript switched off, you’re part of the problem, not the solution.
Web developers need to stop wasting their time pandering to a tiny proportion of internet users and help push the web forward by making the web a bad place to be if you use “old IE” or have JavaScript switched off.
Using the web in 2013 means using JavaScript.
I agree 100%.
Especially with “stop wasting their time pandering to a tiny proportion of internet”.
Agree
Ditto.
I agree in a sense…I completely ignore old IE when developing most websites on my own. However, my company works with a lot of state DOTs and other government agencies which require IE7, IE8, and other awful things. So, as much as I don’t want to contribute to the problem, my job and paycheck sometimes depends on it. :-/ Blame the slow government agencies if you want to blame anyone.
While I agree that anyone browsing with JavaScript turned off is probably part of a very, very small group, and likely doesn’t realize they have it turned off, and that the way forward is generally to let go of backwards-compatible practices, I also think that there are certainly situations where clients have demands that are part of a bigger picture than they realize, and expect compatibility for all of their customers, even the tiny number that might not have JavaScript enabled. For those situations I am really glad I have a resource as excellent as this website to provide help on those specific scenarios.
Thanks Chris and Jesse!
For the record, this demo does exactly that: nothing special for non-JS users. Yet, works.
I also hate to be dogmatic about something like whether or not you choose to support non-JS users. I have worked on quite a few apps that largely ignored the non-JS case, and it’s never been a big deal.
Does anyone have any statistics on the percentage of users not using JS? I would have to guess it’s incredibly small.
The percentage of users with JS explicitly turned off is small. The percentage of users with crazy GreaseMonkey scripts & Chrome extensions that will mess with your DOM and break your JS is huge, especially if your audience is devs. Progressive enhancement FTW.
I agree with Ben. This is not about supporting users with JS turned off but about not breaking web conventions.
What happens when that third party script throws a JS error in production and you haven’t progressively enhanced the solution? = broken urls.
It’s not for users that one has to worry about when JavaScript is disabled. It’s for search engines.
If content is only available through a JavaScript function, they won’t be able to crawl and index it.
I have to agree with Jon quite a bit. People won’t feel the obligation to upgrade if they are continually catered to by the development community. That said, I do feel that more persuasion is better than simply allowing them to load a horrible-looking, non-functioning site.
I’d be much more in favor of seeing the development community universally using a method that informs users they’re not using Javascript and should either update their browser or whitelist the site in their scriptblockers. Being courteous and helpful goes a long way in helping people to adopt new technologies – especially if they’re the people that haven’t felt like upgrading since early IE.
Maybe we should ignore non-JS users, maybe not. It doesn’t matter. You just can’t be right when saying “Web developers need to stop wasting their time”, cause they’re not! Do you even know how hard it is to keep all your users satisfied? And do you even know that, if a dev is able to do that, then he certainly has the right to be called an expert, a master, skilled, or whatever you want to call him? Web development (which includes designing, programming, combining the two, personalizing your apps, making them more accessible, user-friendly, clean and easy to use, managing content, creating CMSs) is not about making websites very very fast just to earn money. At least, in my opinion. If a dev really enjoys doing what he/she does, if he loves his job and it has become more than just a hobby, then he is also, at any time of the day, glad and willing to learn new things, train his skills, and become a better designer. It isn’t about… the usage of JavaScript in 2013, or something like that. It’s about the WISH to code better, design better, develop better. For yourself.
I think many of you are missing the boat here and not understanding the circumstances.
If you’re building a portfolio website to showcase your photography, you probably don’t really need to cater to those who don’t have JavaScript enabled. If you’re building an enterprise-scale web portal serving millions to hundreds of millions of people, then I absolutely think you need to cater to them. While 90% of you out there couldn’t care less about non-JavaScript users, there’s still a large number of us who do have to support them.
@Jonathan Graft made a good point that many of us who work in the web world still need to develop and code for state DOTs and government websites that have legal web requirements.
Many also fail to realize websites built especially in enterprise environments require strong browser compatibility and progressive enhancement. Denying customers in any shape or form is potentially denying future business. Just because you’re on the latest version of Chrome or Firefox doesn’t mean that Joe Schmoe working on a bank computer at work isn’t still stuck on IE7 and needs to use your website. Joe Schmoe’s IT policy could also have their browsers’ JavaScript turned off by default as well. Successful business relies on catering to your customers needs, not your needs.
Take yourself back at least a decade and further when Flash websites were around every corner. Not one business made a “flash” website, and only a flash website. There was almost always a normal web version. The reason? Many people didn’t have flash, and they didn’t want to wall out potential customers.
While it may be puppies and rainbows to just say we shouldn’t do something or support something anymore, just make sure you are aware of what it might impact before making such decisions.
While I agree supporting these users can be tedious, until the web has moved in a more standardized compliant direction we still need to cater to those not up to speed.
I think as a web developer you should understand how the web was intended to work. Mark Massé explains in the REST API Design Rulebook that:
In other words, I think it’s ill-advised to dismiss users who don’t have JavaScript enabled or are on an old browser; develop with graceful degradation in mind instead.
I agree. If you’re not using JS these days, then well, get off the internet.
Many tracking technologies require JS to do their job.
NoScript is one of the most popular Firefox extensions. Even the German state recommends its use to citizens.
Users that deliberately deactivate JS are more likely to be sensitive to privacy issues than others, and therefore they may block other statistical methods (like web bugs), too.
So it’s harder to track and count visitors without JS. Many tracking methods ignore them. Such statistics are naturally flawed if you want to know how many of your visitors have JS deactivated.
What browsers does this method support?
Most of the modern ones! http://caniuse.com/#feat=history. Since we’re practicing progressive enhancement, the site will work fine in browsers that don’t support the history API.
Small note, but delegate() has been replaced by on() as of jQuery 1.7 – might want to update that one!

Good follow-up Jesse. Let me add something to it. When sharing a link on Facebook, it will grab the page title and meta description. Making your site work without JavaScript is a good start to a successful social and search campaign.
Besides that, I’ve been involved in a project where we implemented a similar setup about a year ago. Bounce rate dramatically dropped from 43% to 7%. Average pages per visit went from 3 to 14. Average time on site tripled and page views septupled the first week it was launched. Content that couldn’t be shared on Facebook was now shareable. Google and Bing could get to every article. Fewer annoying page refreshes, with faster load times. Better analytics. Oh, what an experience!
Now, if only we can implement your code into all the photo galleries, slide shows, parallax scrolling, modals, etc. we incorporate into sites these days. Think about how easy sharing will become then!
(Without interfering with the back button, allowing us to use it on our own terms.)
Hmmm… this would come in handy with infinite scrolling…if that’s even close to possible.
http://www.usabilitypost.com/2013/01/07/when-infinite-scroll-doesnt-work/
Imagine the URL changing as you scroll so you can bookmark where you last left off. Then pagination might not be needed.
You made an excellent point about making the core of your website functional without JavaScript, because of all the search engines and social websites. The traffic is too valuable not to make your website friendly to those JavaScript-ignoring bots.
It isn’t that hard to do anyways. Not to sound harsh but it’s probably laziness on the side of the developer. You should be building a basic website front-end that functions without JS and then enhancing it with the latest trends like you mention.
Maybe I am just venting because I’ve been running into too many websites with infinite scrolling where I accidentally click on something, and when I go back it starts at the beginning again. NO!!!! Shakes fist at the sky
I agree with @Jon Hobbs, caring about those users is pretty much useless.
However, can’t the problem of landing URL hashes be managed server-side?
Maybe you could but hashes were mainly used so you wouldn’t have to interact with the server more than once—saving on HTTP requests and the annoying page refresh. If this demo wasn’t connected to Google’s CDN for the jQuery file you could navigate the tabs with your internet connection disabled. Save the site locally, disconnect your connection, and give it a whirl. This wouldn’t be possible if the server had to be involved.
Hope this was helpful.
I’m failing the homework assignment. :) How do you keep the popstate event from firing on page load?
Here’s the trick I’m using:
Argh, I want to edit my comment but can’t. The jQuery function that should be used is live (instead of delegate and on).

@Guilherme, on is still correct. From the jQuery API: “As of jQuery 1.7, the .on() method provides all functionality required for attaching event handlers”. If you wanted to treat on like live, you could delegate it from the body, which is still more efficient than the former live method (it uses document). Obviously, the further down the tree you can delegate from, the more efficient.
$("body").on("click", "a", function(){});
I am also failing the homework, mostly because I’m new to JavaScript. How is this fix implemented?
Two notes: the first is simply that I wouldn’t load something as big as Modernizr when you could simply check whether window.history.pushState is defined – that script tag is going to block rendering and add some extra server round-trips to replace a dozen bytes of JavaScript.

Second, in response to the people advocating ignoring JavaScript-less users: remember what happened when Gawker, Twitter, etc. did this? Any bug in your JavaScript, the user’s network connection, etc. breaks your site and makes you look bad – which is key to why everyone has backed away from that strategy. Put another way, if you use progressive enhancement your site still looks great but doesn’t require extra engineering work to load quickly, be robust, work well with search engines and other robots, etc.
The thing with using Modernizr is that they always go the extra mile and deal with quirks in feature testing that you might not have thought of.
In this very example, History has a quirk:
https://github.com/Modernizr/Modernizr/blob/master/feature-detects/history.js
@Chris Coyier: good point. In this case, the actual feature detect is under 200 bytes, which is around the point where I’d inline it to avoid the extra RTT. The HTTP request headers alone would be more traffic…
@Rob: the main point was that a script tag blocks further page loading & rendering until the script has completely run. If you need to support vintage Android releases, I’d inline that script to avoid punishing everyone else.
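For reference, the quirk being discussed boils down to something like the following (a paraphrase of the idea behind Modernizr’s detect, not a verbatim copy – the function name and exact user-agent strings here are illustrative): a plain capability check plus a blacklist for stock Android browsers that claim pushState support but implement it badly.

```javascript
// Sketch: feature-detect the history API, blacklisting stock Android
// 2.x and 4.0 browsers whose pushState implementations are broken.
function supportsHistory(userAgent, hasPushState) {
  var ua = userAgent;
  if ((ua.indexOf("Android 2.") !== -1 || ua.indexOf("Android 4.0") !== -1) &&
      ua.indexOf("Mobile Safari") !== -1 &&
      ua.indexOf("Chrome") === -1 &&
      ua.indexOf("Windows Phone") === -1) {
    return false; // claims support, but lies
  }
  // Otherwise trust the plain capability check,
  // i.e. window.history && "pushState" in window.history.
  return hasPushState;
}
```

That’s the extra mile Modernizr walks for you; if you inline your own check, decide whether those browsers matter to your audience.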
You don’t need to load the full Modernizr file; on the site you can choose whatever options you need and download a small JS file.
Custom build of Modernizr FTW! I typically start with a full build, and then, before release, check what I’m actually using and create a custom build with just that. Inlining the script is still blocking, by the way.
Another reason to use Modernizr – it includes the html5shiv (unless you explicitly ask it not to), so you can cater to pre-HTML5 browsers that don’t understand <section>, etc.
I agree with the sentiment that users who have js disabled will have a horrible web experience in general. Having said that, I think it is good practice and reasonable to expect that the simple showing and loading of content does not depend on javascript.
Anyway, I had no idea it was possible to change the URL without the use of hashes and without refreshing the page. Very cool tip.
Sorry, maybe I’m missing something… but if a user’s JS is disabled, how does a JS-based solution make things better?
As I understand it, the problem here is the URL sent from JS user to non-JS user. This fix creates a URL that’s friendly to both. As Aaron pointed out, it makes the content easier to share.
It doesn’t do anything special for non-JS users browsing the site themselves, but that’s irrelevant because that’s a different scenario than the one outlined above.
I too don’t see why pushState and popstate are so much better than the #hash suffix on the URL.
Don’t forget that when loading content dynamically onto the page, you only load part of the HTML page. If you want non-JavaScript-friendly URLs, you must also implement a second, full page on the server.
That means a lot of work is required to support both clients with and without JavaScript!
Not worth the effort for me.
I support those that said forget about non-js users. It’s unlikely that these users will be trying to go to your fancy modern website without javascript enabled, and it’s unlikely they will care about links to such sites from their friends. For modern web development I would rather believe that js is a standard capability, and reality is so close this ideal that it might as well be considered true. No need to waste time on features with so little benefit.
I read your article carefully, unfortunately a bit too technical for me. I’ll try to inform myself better.
Really interesting write-up… thanks for the post!
How about a double fallback? Order of: pushState, hashbang, normal.
You won’t remove the risk of someone getting a non-working URL entirely, but you will greatly decrease it, and you give another 10–20% of your audience the same experience as modern-browser users. That can be very important for UX if, like in my case, the designer wanted slow-changing background images.
Also, it’s often forgotten to update things like document.title (always use native JS for this). Otherwise bookmarks get a broken title as well.
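To sketch that point (helper names are mine, not from the article): after the ajax load, you can pull the new page’s <title> out of the fetched markup and assign it with plain JavaScript.

```javascript
// Sketch: extract the <title> text from a fetched HTML string.
function extractTitle(html) {
  var match = /<title[^>]*>([^<]*)<\/title>/i.exec(html);
  return match ? match[1] : null;
}

// Assign it natively so history entries and bookmarks read right.
function applyTitle(html) {
  var title = extractTitle(html);
  if (title !== null && typeof document !== "undefined") {
    document.title = title; // native assignment, no jQuery needed
  }
  return title;
}
```

You’d call applyTitle with the raw response inside the load callback (e.g. via $.get before injecting the fragment), since .load() with a fragment selector discards the <head>.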
The code will produce an evil global.
In case anyone is wondering: if you omit the var keyword in JS, the variable becomes an implied global.
I’m using this library for cross-platform compatibility
https://github.com/balupton/history.js
it also has an optional fallback using #hashtags to get the variables back in non-History browsers. The hashes are not really elegant, but they do the job.
I don’t know of any other lib; this one hasn’t been updated for a while, but it still works.
I’ve been using this on http://www.area.fr/ (without hashtag fallback support though)
For those “visitors with JavaScript disabled” haters… When I create a website I rarely, if ever, think about those visitors. But somehow, the final product I deliver to the client is progressively enhanced. Why? Because the client cares about sharing and getting properly indexed on search engines. If your client cares about search and social, you have to consider using progressive enhancement. Charge more, educate your client – whatever it takes. Not all our clients are like Nike (nikeworld.com) and can get away with it. They were the first to have parallax scrolling. They have Kobe!
Now, I’m also working on different projects that use Angular.js and Handlebars.js. I took the offline first approach. http://developer.chrome.com/apps/offline_apps.html which means this app will not work AT ALL when JavaScript is disabled. But that’s okay because we don’t want social and search to play a role for an app that only wants humans to access it.
Hope this was informative. (:
This is just awesome. No fuzzy hashtags in the site URL, and the UX is also cool. Both of the examples are good.
I don’t do much for non-JS users, mostly just cross-browser support; and it works fine with IE7, IE8 and above.
This article is extremely timely. I had just had a discussion with a client about single page vs. multiple page and JavaScript vs. non-JavaScript browser support. Every website I build works with JavaScript turned off. I like it that way. I can’t dictate how users browse the web, and Google Analytics can’t tell me whether users have JavaScript on or off, because GA can only track users with JavaScript on. Modernizr has made this approach easy, allowing me to write my CSS with functionality in mind.
I will have to give this tutorial a try. It looks very clean.
This doesn’t seem to work with IE 9. Instead of loading the content dynamically, it goes straight to the link in the href.
My friend doesn’t have javascript enabled. She opens the link in her browser, and is confused that the awesome product doesn’t load as expected.
Who are these people? And why are you friends with them?
Sweet! I’ve had a situation where a client thought the URL with the hash sign was a sign of an error, and ugly looking too; I tried in vain to convince him otherwise. That was the last time I implemented Ajax for anyone. This seems to be the answer to that. Thanks Chris.
Well, unfortunately some methods like history.pushState() aren’t supported by old browsers…
@Chris (#comment-251890). So you mean using something like

if (history.pushState) {
  ...
}

instead of Modernizr won’t account for “quirks”?
Why not just have the whole site made and replace the hrefs with hashes with JavaScript on load. This way you still have some nice fancy hash things or am I missing something here?
How does this affect analytics? Does history.pushState count as separate page-views?
In Firebug’s Net panel, I notice clicking links doesn’t show a new page load…
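Right – pushState alone records nothing; analytics scripts only see the initial load, so each ajax-swapped “page” has to be reported manually. A hedged sketch using the classic ga.js async queue (the function name is mine; adapt this for whatever analytics library you actually use):

```javascript
// Sketch: report an ajax-loaded "page" as a pageview via the classic
// ga.js command queue (_gaq). Call this right after loadContent(href).
function trackVirtualPageview(href) {
  if (typeof _gaq !== "undefined") {
    _gaq.push(["_trackPageview", href]);
    return true;  // queued for Google Analytics
  }
  return false;   // analytics not loaded; silently skip
}
```

With that in place, each pushState navigation shows up as its own pageview in your reports.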
Wait, how can you use the HTML5 history API when you need JavaScript to do it? Doesn’t that defeat the purpose?
If you go to http://www.danperceval.com, you’ll see that the image slider (of the cars) works on the main content area. But if you click on ABOUT, it doesn’t work, though the CSS does seem to work. And in Chrome and Safari, the image slider just doesn’t work period, even on page load. Content loaded with the dynamic page script doesn’t seem to work with jQuery or any scripts. Also, I’m a beginner in CSS and know nothing about scripting whatsoever. Could you please look at my page (www.danperceval.com) and see if you can figure out why jQuery isn’t working? Thanks.
I’m using this script, but it’s interfering with a lot of other random things on my site. For example, Shadowbox and the Google Maps API don’t work with this:
$(window).bind("popstate", function() {
  _link = location.pathname.replace(/^.*[\\/]/, ""); // get filename only
  loadContent(_link);
});
Also, I’m having problems linking within certain sections, as the file name doesn’t resolve back to the root directory.
I’m generally a fan of progressive enhancement, but the problem with the history API is that you may not be properly balancing the number of IE8/IE9 users against the number of visitors with JS turned off. If your site has a fair number of IE8/IE9 users, you just downgraded their experience big-time when you forced them to post back every time they click on something. (A fair % of mobile users also still have browsers with no history API, if mobile is your thing.) Most/all of these users could have had a better experience with hashes turned on.
Best practices are great, but for me, the ultimate best practice is providing the best experience to the greatest number of visitors to my pages. If I can provide a great experience for 99% of my visitors, and 1% (the luddites with no JS) get an error screen, I’ll take that. That’s far preferred to providing 80% of my visitors with a great experience, and 20% with a working, but crappy, experience.
(I’ve also seen developers get themselves into trouble using approaches like this, e.g. jQuery Mobile. Having to do HTML surgery to support dynamic loading of full-blown pages is usually more complicated and error-prone than loading and rendering snippets, or just showing and hiding content from a single page.)
If a specific page has additional JavaScript and/or an additional stylesheet, is there any way to load those as well?
var $myNav = $(".myNav a"),
    $mainContent = $(".main-content");
Maybe it’s me, but I think this makes it look better, and it plays better with bootstrap.
Tightened the code up a bit
If you do not want a full page fade in you can just remove that line.
Have fun.
I’m still a bit new to JS. Chris Coyier, or anyone, can you please post the homework assignment answer(s)? None of what is posted works properly.
Thanks much in advance
I’m finding that this breaks when the links that trigger the dynamic content replacement are within the <div id="guts">.
This didn’t appear to be an issue in the previous version. No idea how to go about fixing this. Any advice would be appreciated.
This is why I stopped using it completely.
Depending on your site: if you are using dynamic includes (e.g. ?page=Example), the only fix I can think of is reloading the script inside the script, and that leads to a nasty memory leak.
But if your links lead straight to the file (e.g. \example.html), then you can modify this line like so:
to
Because you are loading the HTML file directly, you do not need the code to look for #guts within the entire page, unlike PHP includes, which put the entire page together before the JS can get to it.
Your solution is easy, clean and clear. Thank you very much!!!
When I preview locally in Chrome it doesn’t work. :(
Safari is fine, but in Chrome ‘guts’ goes empty and stays empty.
Am I doing something wrong?
Link: http://stackoverflow.com/questions/23629244/popstate-html5-and-ajax-load-content
We have a problem with the popstate event. We made a mini website with this amazing new property and had no problems until I made a contact page with an Ajax response.
Here are my script.js (I removed unnecessary scripts and controls) and HTML code. When I go to the contact page directly, I have no problem with the Ajax call: I fill in all the information, click the submit button, and the Ajax call runs the PHP script (in the background) that sends the email without changing the page.
The problem begins when I go to another page, then click the contact page link. Now when I click the submit button, the Ajax call doesn’t trigger, and the browser (Chrome and Firefox) calls the PHP script in the form’s action directly, changing the page (not in the background). Does anyone have a solution? Thanks very much!
I just added this to a website I’m currently building and everything works fine except one thing:
On the first load of the index page the slider I’m using works, but when I click again on the link to the index page, the slider script doesn’t load, so I only see the list items and not the slider.
Any ideas on that?
Here’s the link to to the page: http://www.level26.at/testsites/dj-yeezy/
I’ve used this script with success in several projects.
Now I’d like to use it with Joomla! 3.
Is there a way to adapt it to Joomla?
My site uses different JavaScript on each page because of the necessary content. How do I load the script for the content with the content?
…
I’ve tried referencing all the scripts from the index, I’ve tried referencing the script inside the loaded content, and I’ve even tried script replacement by creating a script variable with an ID and swapping it from the content div.
Any help would be dearly appreciated.