I can’t stop thinking about this site. It looks like pretty standard fare: a website with links to different pages. Nothing to write home about, except that… the whole website is contained within a single HTML file.
What about clicking the navigation links, you ask? Each link merely shows and hides certain parts of the HTML.
<section id="home">
<!-- home content goes here -->
</section>
<section id="about">
<!-- about page goes here -->
</section>
Each <section> is hidden with CSS:
section { display: none; }
Each link in the main navigation points to an anchor on the page:
<a href="#home">Home</a>
<a href="#about">About</a>
And once you click a link, the <section> for that particular link is displayed via:
section:target { display: block; }
See that :target pseudo-class? That’s the magic! Sure, it’s been around for years, but this is a clever way to use it for sure. Most of the time, it’s used to highlight the anchor on the page once a link to it has been clicked. That’s a handy way to help the user know where they’ve just jumped to.
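For reference, that common highlighting pattern is tiny. A sketch (the pale yellow is an arbitrary choice, not from the post):

```css
/* Highlight whatever element the URL fragment points to,
   e.g. a heading after clicking a table-of-contents link to it */
:target {
  background-color: #fff3b0; /* arbitrary pale yellow */
}
```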
Anyway, using :target like this is super smart stuff! It ends up looking like just a regular website when you click around:
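Putting the pieces together, a minimal single-file page might look like this sketch. The two last-of-type rules at the end are an addition the post doesn’t cover: one common way to keep a default section visible before any link has been clicked (it assumes the “home” section is the last one in the markup):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Single-file site</title>
  <style>
    /* Hide every section by default… */
    section { display: none; }
    /* …and reveal the one matching the URL fragment */
    section:target { display: block; }
    /* Show the last section (home) when no fragment is set… */
    section:last-of-type { display: block; }
    /* …but hide it again once any other section is targeted */
    section:target ~ section:last-of-type { display: none; }
  </style>
</head>
<body>
  <nav>
    <a href="#home">Home</a>
    <a href="#about">About</a>
  </nav>
  <section id="about"><!-- about page goes here --></section>
  <section id="home"><!-- home content goes here --></section>
</body>
</html>
```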
Fuck – love this.
I like the whole idea; it’s almost perfect, but…
lazy loading isn’t widely supported yet (https://caniuse.com/loading-lazy-attr), so Safari will choke on loading all the content at once, especially on mobile. Am I right?
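For context, the attribute the caniuse link refers to looks like this (the filename and dimensions are made up for illustration):

```html
<!-- Where supported, the browser defers fetching this image
     until the user scrolls near it -->
<img src="gallery/photo-1.jpg" loading="lazy"
     alt="Example gallery photo" width="800" height="600">
```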
No disrespect but… that is NOTHING.
Check out tiddlywiki.com if you want to see the most powerful single html file you can imagine.
But there is like… a bunch of JavaScript.
Another one of those “why the **** didn’t I come up with this” kind of things.
Clever.
Exactly how we made Flash websites back in the day. Complete with hash bangs and everything. What could go wrong?!
What about indexing display:none; sections by search bots? Wouldn’t they ignore it?
No, IIRC search engine indexing bots don’t run CSS; they look at the source code.
I’m not entirely sure that’s true. Yes, of course, bots look at source code. But they know what hidden content is. They run JavaScript. They are very smart, and if you’re doing something that very few other sites do, I think it’s fair to worry that a bot designed for billions of sites might not work properly on an outlier.
Instead of using display: none, hiding the sections inclusively (https://www.scottohara.me/blog/2017/04/14/inclusively-hidden.html) should also fix any possible issues with web crawlers.
Very interesting! Unfortunately, unless the page title is also updated, the browser (and back button) history quickly becomes completely unusable.
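The “inclusively hidden” technique linked above boils down to visually hiding content while leaving it in the accessibility tree. A rough, untested sketch of how it might be applied to the non-targeted sections:

```css
/* Visually hide sections that aren't targeted, without display: none,
   so screen readers (and possibly crawlers) can still reach the text */
section:not(:target) {
  position: absolute;
  width: 1px;
  height: 1px;
  padding: 0;
  margin: -1px;
  overflow: hidden;
  clip-path: inset(50%);
  white-space: nowrap;
}
```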
Back button and history work fine for me (on Firefox) with the example website in this post.
But think of the unnecessary traffic: if I want to see the content of only one page, I load all the content for no reason. You might say that doesn’t matter on today’s broadband lines, but what about mobile users?
You can have it with the Eleventy SSG too: Solo
I use a similar trick on a website of mine: almost the entire website uses no JavaScript, except for a permalink page, and even that has a pure CSS fallback in case JavaScript doesn’t work for any reason.
You can check it here (the content is in Portuguese, by the way). It works with or without JavaScript. Requires a click, but works.
All these years and I can’t believe I never considered using :target in that way! Very interesting indeed.
Supported across the board, too: https://caniuse.com/?search=%3Atarget
I think you might like this crazy little project, made 10 months ago. It’s a ‘pixel drawer’ built only with HTML/CSS: https://sdp.iglou.eu/
Or my old personal website, with 0% JS and a lot of anchor usage: https://iglou.eu/
Very interesting concept! I think it could replace JavaScript for partial or alternative content when appropriate, like displaying tagged items from a blog post list, or navigation on small screens. The kind of content that wouldn’t necessarily require its own title and meta tags. Probably a good alternative to a full page reload in some circumstances in a Jamstack architecture. Thank you for sharing!
Is it possible to select the currently active link with CSS?
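Not with :target alone, since it matches the targeted element rather than the link pointing at it. With the newer :has() selector (not part of this trick, and not supported in every browser), something like this hypothetical sketch gets close:

```css
/* Bold the nav link whose section is currently targeted.
   One selector per section id; requires :has() support. */
body:has(#home:target) nav a[href="#home"],
body:has(#about:target) nav a[href="#about"] {
  font-weight: bold;
}
```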
This looks like an SPA to me. It even (literally!) has CSR.
Very cool.
Coupled with a good SSG, a one page website is a winning ticket for a tiny project documentation or a developer blog.
No need to worry about client-side JavaScript.
I developed Solo with this in mind within a few days after having discovered Gregory’s repo.
Based on Gregory’s awesome concept, I built a single-file blog generator, PHPetite.
https://phpetite.org
(Just for anyone reading through these comments in the future)
Hate to sound old, but this is how we built sites back in the day.
That’s fun. I did something surprisingly similar in 2002–2003 (with the help of basic JS) for a project at university. My tutor told me then, “Interesting, but too weird to be generalized.”
When I saw the title, I thought this would be another SPA tutorial.
Nicely done.
I elaborated on the idea and created a website dedicated to this kind of website. You can find it here:
https://www.zengardenwebsites.com