Websites, especially blogs, are slow. I have a super-fast network connection that sits mostly idle, because the bottleneck is in server-side content management systems, scripting languages, and databases. In this world of neglected, sluggish servers, client-side prefetching comes to the rescue. It sounds great in theory, but where are all the practical implementations?
Let's quickly consider the alternatives. Slow websites cannot be avoided, because they contain unique content. I cannot speed them up directly, because they are under someone else's control. Nobody operates a global cache that would have these pages already prefetched. Client-side caches are already implemented, but they are increasingly ineffective.
The idea of client-side prefetching is simple. The browser has plenty of information about the user's recent actions, which can be used to predict what the user will do in the coming seconds. These predictions can then be used to fetch content speculatively.
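As a toy illustration of the prediction half of this idea, here is a minimal sketch (my own, not from any browser): a first-order Markov model that counts which page the user tends to open after the current one. A real implementation would combine many more signals, such as link hovering and scroll position.

```python
from collections import defaultdict, Counter

class NavigationPredictor:
    """Toy predictive-prefetch model: for each visited page, count which
    page was opened next, and predict the most frequent follower."""

    def __init__(self):
        # transitions[page] is a Counter of pages visited right after it
        self.transitions = defaultdict(Counter)
        self.current = None

    def record_visit(self, url):
        """Record a page visit, updating the transition counts."""
        if self.current is not None:
            self.transitions[self.current][url] += 1
        self.current = url

    def predict_next(self):
        """Return the most likely next page, or None if we have no data."""
        candidates = self.transitions.get(self.current)
        if not candidates:
            return None
        return candidates.most_common(1)[0][0]

# Simulated browsing history (hypothetical URLs):
predictor = NavigationPredictor()
for url in ["/home", "/blog", "/home", "/blog", "/home", "/about", "/home"]:
    predictor.record_visit(url)

# From "/home" the user went to "/blog" twice and "/about" once,
# so "/blog" is the page worth prefetching now.
print(predictor.predict_next())  # → /blog
```

The browser would then issue a speculative request for the predicted URL during idle time, discarding the result if the prediction turns out wrong.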
There is the Fasterfox add-on for Firefox. The add-on's maintainer doesn't make it clear how the extension works, but there's a nice summary of Fasterfox prefetching on Wikipedia. It's very inefficient. Its prefetching is limited to static-looking URLs on caching-friendly websites, which is a small part of the Internet, and it's usually exactly those sites that don't benefit from prefetching. Fasterfox is also very costly, because it prefetches every single link on the page.
HTML5 allows marking links for prefetch, but this is the kind of feature that only big, well-maintained sites are likely to use. Blogs will never use it for internal links. Google is one of those big sites, so this feature could improve the speed of browsing through search results. Google, however, uses the prefetch tag very sparingly. I guess it will have a negligible impact on the browsing experience.
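For reference, the HTML5 prefetch hint is a single line of markup in the page's head (the URL here is a placeholder):

```html
<!-- hint: the browser may fetch this page during idle time -->
<link rel="prefetch" href="/next-article.html">
```

The hint is advisory: the browser decides whether and when to act on it.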
The Chrome browser is much faster than Firefox, and one of the reasons is that it performs speculative requests based on user input. It's not really page prefetch, though. Chrome resolves domain names, opens connections ahead of time, and performs similar low-level optimizations. These actions are triggered by user behavior, e.g. hovering over a link or typing a URL that auto-completes with a high hit ratio. This shows that there are many optimization opportunities that can be exploited before full page prefetch. Unfortunately, Chrome is much slower than Firefox when an ad-blocker is installed.
Currently no browser or extension implements proper predictive prefetching the way I imagine it. Nevertheless, I see no reason to encourage site maintainers to optimize millions of websites by hand. There are tons of optimization opportunities in browsers that are cheaper to implement and much more likely to materialize in the future.