We're in the future now, so of course we're working on ways to speed up the web with fancy new tactics above and beyond the typical make-pages-slimmer-and-cache-like-crazy techniques.
One tactic, from years ago, was InstantClick:
Before visitors click on a link, they hover over that link. Between these two events, 200 ms to 300 ms usually pass by (test yourself here). InstantClick makes use of that time to preload the page, so that the page is already there when you click.
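That hover-to-preload idea fits in just a few lines. Here's a minimal sketch (not InstantClick's actual code — the helper names like `markForPreload` are made up for illustration): listen for mouseover on links, and inject a `<link rel=prefetch>` for each destination the first time it's hovered.

```javascript
// Track which URLs we've already queued, so each page is only prefetched once.
const preloaded = new Set();

// Returns true the first time a URL is seen, false afterwards.
function markForPreload(url) {
  if (preloaded.has(url)) return false;
  preloaded.add(url);
  return true;
}

function preload(url) {
  if (!markForPreload(url)) return;
  const link = document.createElement('link');
  link.rel = 'prefetch'; // low-priority fetch into the browser's cache
  link.href = url;
  document.head.appendChild(link);
}

// Start preloading on hover, spending the ~200-300 ms before the click.
if (typeof document !== 'undefined') {
  document.addEventListener('mouseover', (event) => {
    const anchor = event.target.closest('a[href]');
    if (anchor) preload(anchor.href);
  });
}
```

The `Set` matters: mouseover fires repeatedly as the cursor moves across a link, and you only want one prefetch per URL.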
Clever, but not as advanced as what can be done in these modern times. For instance, InstantClick doesn't take into account the fact that someone might not want to preload stuff they didn't explicitly ask for, especially if they are on a slow network.
Addy Osmani wrote up a document calling this "predictive fetching":
... given an arbitrary entry-page, a solution could calculate the likelihood a user will visit a given next page or set of pages and prefetch resources for them while the user is still viewing their current page. This has the possibility of improving page-load performance for subsequent page visits as there's a strong chance a page will already be in the user's cache.
Another contender is Quicklink by Google:
Quicklink attempts to make navigations to subsequent pages load faster. It:
- Detects links within the viewport (using Intersection Observer)
- Waits until the browser is idle (using requestIdleCallback)
- Checks if the user isn't on a slow connection (using navigator.connection.effectiveType) or has data-saver enabled (using navigator.connection.saveData)
- Prefetches URLs to the links (using <link rel=prefetch> or XHR). Provides some control over the request priority (can switch to fetch() if supported)
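The steps above can be sketched with the same browser APIs the list names. This is an illustrative approximation, not Quicklink's source; the `okToPrefetch` helper is a made-up name for the connection check:

```javascript
// Decide whether prefetching is polite: skip it when the user has
// data-saver on or is on a 2g-class connection.
function okToPrefetch(connection) {
  if (!connection) return true; // navigator.connection unsupported: assume OK
  if (connection.saveData) return false;
  if (/2g/.test(connection.effectiveType || '')) return false; // '2g' or 'slow-2g'
  return true;
}

if (typeof window !== 'undefined' && 'IntersectionObserver' in window) {
  // 1. Detect links as they enter the viewport.
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      observer.unobserve(entry.target);
      // 2. Wait until the browser is idle.
      requestIdleCallback(() => {
        // 3. Check the connection, then 4. prefetch the URL.
        if (!okToPrefetch(navigator.connection)) return;
        const link = document.createElement('link');
        link.rel = 'prefetch';
        link.href = entry.target.href;
        document.head.appendChild(link);
      });
    }
  });
  document.querySelectorAll('a[href]').forEach((a) => observer.observe(a));
}
```

Each piece degrades gracefully: if navigator.connection isn't available the sketch assumes the network is fine, and if IntersectionObserver isn't available it simply does nothing.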
No machine learning or analytics usage there, but perhaps the most clever yet. I really like the spirit of prefetching only when there is a high enough likelihood of usage; the browser is idle anyway, and the network can handle it.