The idea is that there is a measurable delay between hovering over a link and clicking it. Say that delay is 300ms. That 300ms could be spent preloading the next page, and if it is, the page loads that much faster.
This new project makes use of newer tech to get it done. It's hardly any code, the core of which is appending a <link rel="prefetch" href=""> to the document for the link you're about to click or touch.
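That core idea can be sketched in a few lines. This is my own illustration, not the library's actual code; the helper name and the same-origin check are assumptions about a sensible implementation:

```javascript
// Hover-triggered prefetching, sketched for a browser environment.
// Tracks what has already been prefetched so each URL is hinted once.
const prefetched = new Set();

// Decide whether a link is worth prefetching: skip empty hrefs,
// duplicates, and cross-origin URLs (an assumption, not the library's rule).
function shouldPrefetch(href, origin) {
  if (!href || prefetched.has(href)) return false;
  const url = new URL(href, origin);
  return url.origin === origin;
}

// The DOM wiring is guarded so the helper above can be tested anywhere.
if (typeof document !== 'undefined') {
  document.addEventListener('mouseover', (event) => {
    const link = event.target.closest('a[href]');
    if (!link || !shouldPrefetch(link.href, location.origin)) return;
    // Append the prefetch hint; the browser fetches the page in the background.
    const hint = document.createElement('link');
    hint.rel = 'prefetch';
    hint.href = link.href;
    document.head.appendChild(hint);
    prefetched.add(link.href);
  });
}
```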
The integrity attribute means that if you trust the code as it is now, it can't ever change unless you change that attribute along with it. It also cleverly uses type="module" to prevent it from loading anything in browsers that don't support prefetching anyway.
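Put together, the embed looks something like this. The src URL and hash below are placeholders of my own, not the project's real values:

```html
<!-- type="module" doubles as a capability check: older browsers that
     wouldn't benefit from prefetching ignore module scripts entirely.
     integrity pins the script to this exact hash, so if the hosted file
     ever changes, the browser refuses to run it. -->
<script
  src="https://example.com/prefetch-library.js"
  type="module"
  integrity="sha384-PLACEHOLDER_HASH_GOES_HERE"></script>
```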
Still, you could self-host it if you wanted. I have no idea who's ponying up for the bandwidth here, so another risk is a hung script should it stop responding one day.
You could argue that it doesn't do the prefetching as responsibly as it could. Google's similar quicklink library (which we covered here) does two interesting things to be more responsible about prefetching: 1) it waits for requestIdleCallback and 2) it respects info from navigator.connection, like a user enabling data-saver mode.
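Those two checks are straightforward to sketch. The function names and the exact thresholds below are my own guesses, not quicklink's actual rules:

```javascript
// Decide whether prefetching is polite right now, given a
// NetworkInformation-like object (navigator.connection in browsers).
function connectionAllowsPrefetch(connection) {
  if (!connection) return true;                    // API unsupported: assume ok
  if (connection.saveData) return false;           // user opted into data-saver
  if (/2g/.test(connection.effectiveType || '')) return false; // slow network
  return true;
}

// Defer work until the browser is idle, falling back to a short
// timeout where requestIdleCallback is unavailable.
function whenIdle(task) {
  if (typeof requestIdleCallback === 'function') {
    requestIdleCallback(task);
  } else {
    setTimeout(task, 1);
  }
}

if (typeof navigator !== 'undefined' && connectionAllowsPrefetch(navigator.connection)) {
  whenIdle(() => {
    // ...scan for in-viewport links and prefetch them here...
  });
}
```

The data-saver check matters because prefetching spends bandwidth the user may never use; respecting saveData keeps that spending opt-out.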