Backing up a blog should be as easy as clicking the first hit in the search results, but it's not. Perhaps people assume Google will take care of their blog. I am not so confident in Google's ability to keep mine online. Too many people have lost their blogs to unexplained policy violations. Plus there are the usual hacks and screwups that can destroy a blog regardless of hosting quality.
The first hit for my search queries was BlogBackupr, but I don't trust it: it's some half-baked startup, how it's financed is unexplained, and it's not clear whether the data is versioned or whether edits are backed up. It's a nice secondary backup though.
Manual backup methods are completely unacceptable. I have neither the time nor the inclination to become a backup robot. I am therefore ignoring all advice that suggests backing up the blog manually, and there's plenty of such advice. I wonder how many people are willing to put up with it.
Blogger has a built-in Export function, which is reportedly importable into other publishing services, but that's still too manual, and it's not clear how images are handled. There's a tutorial on automating it with curl, but I wouldn't trust that to stay stable long-term. The exported file also cannot be published immediately should the blog go down, so it's more of a migration tool than a backup tool.
Then there's the RSS-based BlogBackup script, which is fully automated. It's a little dated, though, and there's no guarantee it will keep working.
IFTTT is an interesting automation technology, but it seems like overkill for blog backup.
Backupify is nice in that it exists at all, but $3/mo is too expensive for personal blog backup. Think how much that costs long-term: at $3/mo, three blogs over ten years come to over $1,000. There are tons of similar services too.
There's the Blogger Backup app for Windows (Blogger-only), which saves the backup to a local RSS file, but it looks terribly outdated. Its incremental nature also makes it unclear how it handles post edits.
And then there are universal website mirroring tools that support automation via command-line parameters, notably HTTrack and wget. Their advantage is that the downloaded site can be quickly published elsewhere should something happen to the blog. They are idiot-proof and long-term stable, and they also work with non-blog websites where no RSS feed is available. Someone posted the combination of wget switches that will do the backup in one go.
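I don't have the exact switch combination that was posted, but a sketch along these lines should produce a self-contained offline mirror (the blog URL is a placeholder, and the echo is there so nothing runs until you remove it):

```shell
# Placeholder address; substitute your own blog.
BLOG_URL="https://example.blogspot.com/"
# --mirror: recursive download with timestamping; --convert-links: rewrite
# links for offline browsing; --page-requisites: grab images and CSS;
# --adjust-extension: save pages with .html suffixes; --no-parent: stay
# inside the blog; --wait=1: be polite to the server.
WGET_OPTS="--mirror --convert-links --page-requisites --adjust-extension --no-parent --wait=1"
# Print the full command; drop the echo to actually run the mirror.
echo wget $WGET_OPTS "$BLOG_URL"
```

The --convert-links switch is what makes the result directly publishable elsewhere: the mirrored pages reference each other locally instead of pointing back at the original host.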
I will stick with wget since it's the simplest tool that can do the job and I already know it. Once I have the blog replicated in a local folder, I can hook my ordinary backup onto it to create version history. I will use BlogBackupr as my secondary backup just in case.
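Any ordinary backup tool can provide the version history; as one concrete sketch, git works as a stand-in (the mirror directory name is an assumption, and the committer identity is a dummy):

```shell
# MIRROR_DIR is an assumption; point it at wherever wget writes the mirror.
MIRROR_DIR="${MIRROR_DIR:-$HOME/blog-mirror}"
mkdir -p "$MIRROR_DIR"
cd "$MIRROR_DIR" || exit 1
# Initialize the repository once, then stage whatever the mirror run changed.
[ -d .git ] || git init -q
git add -A
# Record a dated snapshot; the || true makes an unchanged mirror a no-op.
git -c user.name=backup -c user.email=backup@localhost \
    commit -q -m "blog snapshot $(date +%F)" || true
```

Running this from cron right after the wget step gives a browsable history of every post and edit, which covers the versioning gap that the hosted services leave unclear.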