I’ve heard people say that the #1 most exciting and important thing that came out of Google I/O this year was the evergreen Googlebot:
Today, we are happy to announce that Googlebot now runs the latest Chromium rendering engine (74 at the time of this post) when rendering pages for Search. Moving forward, Googlebot will regularly update its rendering engine to ensure support for the latest web platform features.
Before this, I guess I never even thought about it. Part of it is that some people already knew the old version didn't support certain newer JavaScript features, and so they literally shipped their apps compiled down to older JavaScript to be more SEO-friendly.
A bunch of people were apparently shipping older code simply for the sake of Googlebot, and now they don’t have to. Sure, I’ll call that a win.
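To make that concrete: the old rendering service was based on Chrome 41, which predates a lot of now-everyday syntax. Here's a hypothetical little component (names and data made up for illustration) using features Chrome 41 couldn't parse at all, so it had to be transpiled to ES5 just for Googlebot's sake:

```javascript
// Syntax the old Chrome 41-based renderer couldn't parse, but that the
// Chromium 74 renderer handles natively: class syntax (Chrome 49+),
// arrow functions (Chrome 45+), and async/await (Chrome 55+).
class ProductList {
  constructor(items) {
    this.items = items; // plain array of product names
  }

  async render() {
    // Promise.resolve stands in for a real async data fetch
    const items = await Promise.resolve(this.items);
    return items.map((name) => `<li>${name}</li>`).join("");
  }
}

// Under Chromium 74 this runs as-is; for the old Googlebot it would have
// needed transpiling down to ES5 first.
new ProductList(["Shirt", "Mug"])
  .render()
  .then((html) => console.log(html)); // logs "<li>Shirt</li><li>Mug</li>"
```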
Don’t read this news as “don’t worry about your JavaScript-rendered pages and SEO” though, because Google Webmasters is still telling us that pages with content that requires JavaScript to render are put into a special slower queue for both initial crawling and for updates. Not necessarily a penalty, but certainly a delay. I’m sure that’s enough to make server-side rendering a priority for sites where SEO is the whole ballgame.
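The idea behind server-side rendering, in the smallest possible sketch (the function name and product data here are mine, purely illustrative): build the finished HTML on the server, so the crawler gets real content in the initial response without running any JavaScript at all, and skips that slower render queue entirely.

```javascript
// Minimal server-side rendering sketch: the full HTML string is built on
// the server, so the content is in the initial response rather than being
// produced by client-side JavaScript. Names and data are illustrative.
function renderPage(products) {
  const list = products
    .map((p) => `<li>${p.name}: ${p.price}</li>`)
    .join("\n      ");
  return `<!doctype html>
<html>
  <head><title>Products</title></head>
  <body>
    <ul>
      ${list}
    </ul>
  </body>
</html>`;
}

// A crawler (or any client) gets meaningful markup with zero JS execution:
const html = renderPage([{ name: "Shirt", price: "$20" }]);
console.log(html.includes("<li>Shirt: $20</li>")); // true
```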
Does this mean that the details/summary element can now be used without the details being considered hidden content? I have tried to find something, ANYTHING, that discusses how Googlebot treats native HTML elements with hidden text, but articles on the subject typically only cover traditional JavaScript accordions.
Great question. Was there evidence Googlebot ignored content in those elements before this change?
This is the only definitive comment from Google I have found one way or the other. It's from 2017, and says such content is treated the same as traditional JavaScript-toggled hidden content: https://twitter.com/JohnMu/status/891695972971053056