I recently realized that I did not have any kind of backup of this site. The blog has grown since I began blogging again about a year ago, especially in the past few weeks since I released messie and highscore under the MIT license.
I didn’t want to lose all the posts, so I wanted to back up at least the database (it’s a MySQL system). In the past I dumped the complete schema and downloaded the dump via SFTP to my local machine, but it’s always the same story: you set it up, everything works, and it runs for a few months. Then something unexpected happens and the dump is no longer downloaded, but you have neither the time nor the desire to go through the whole setup again to find out what failed. That goes on and on.
The other day I read a blog post about using git for things other than version control, and it dawned on me: why not use git and push everything to a remote repository? Git stores textually similar revisions very efficiently (they are delta-compressed in its packfiles), so hourly backups wouldn’t cost much space.
I already had an account on bitbucket.org (check it out, it’s free and you can have as many private repos as you want!), so I created a new one for my backup.
A cronjob then dumps the database, commits the dump, and pushes the new version to bitbucket. It’s that easy, and it only cost me about 10 minutes to write a short bash script and check that it works.
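The script could look roughly like this. This is a minimal sketch, not the script actually used for this blog: the database name, repository path, and remote branch are assumptions, and credentials are expected to come from `~/.my.cnf` rather than being hard-coded.

```shell
#!/usr/bin/env bash
# Hypothetical hourly database backup: dump MySQL, commit, push.
set -euo pipefail

backup() {
  local repo_dir="$HOME/blog-backup"   # assumed: local clone of the bitbucket repo
  local dump_file="dump.sql"

  cd "$repo_dir"

  # Dump schema and data into the working tree
  # (credentials read from ~/.my.cnf, database name is assumed).
  mysqldump myblog > "$dump_file"

  # Commit only when the dump actually changed, then push to the remote.
  git add "$dump_file"
  git diff --cached --quiet || git commit -m "backup $(date +%F_%H:%M)"
  git push origin master
}

# cron calls this script hourly, e.g. with a crontab entry like:
#   0 * * * * /home/user/bin/backup.sh
# Uncomment to run directly:
# backup
```

The `git diff --cached --quiet` guard keeps the history clean: if nothing was written or edited between two runs, no empty commit is created.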
Sometimes the solution isn’t that far away, but one has to think outside the box to reach it.