What if your hosting provider calls you one day and tells you they had a catastrophic fire at their facility and every bit of data on their (your!) servers is gone?

It’s highly unlikely, of course, and it’s more likely that your host does a better job of backing up data than you do. But… it’s certainly possible and ultimately, you are responsible for your sites. If you do client work, the duty of backing up is even more important!
So how about it folks? How do you back up websites? Do you do it at all? I have to admit I’m not as good at it as I should be. What I do is periodically download the entire contents of a site (including database dumps) and throw them in a backups folder locally. My entire system is redundantly backed up, so I figured I’m cool there. I wish I were using something a bit more automated though.
And the choices are…
- I don’t. (you should!)
- I download the files locally. (and back those up)
- Doesn’t my host do that for me? (maybe, you should check)
- My host does this for me. (I checked)
- I have some kind of automated solution. (do share!)
The actual poll is over in the sidebar! RSS readers will have to make the jump over to the site to actually cast their votes.
I just download all of the files locally, burn them to a CD, and label it very clearly. I do this usually once every 2 weeks, depending on changes made to the sites. If a client doesn’t update, it’s kinda pointless to back it up… ;)
We use an app that makes a cPanel backup of the entire site (for each one) and downloads it to that machine. Then we use JungleDisk to upload to Amazon S3. This way we have a copy on a machine at our office as well as one stored offsite.
Web server has copy of production files and databases.
Production and content files are kept locally on mirrored drives; then I also sync to a laptop (MS SyncToy).
Databases are downloaded to local content folders periodically.
DVD backups are done periodically.
My host does this for me, and I save backups every week locally. And every month I burn them onto DVD-R.
“I download the files locally. (and back those up)”
I download the files via FTP (using my host’s site backup compression tool), and then back up the local files using Carbonite.
I use subversion for the code of any website. For actual content I do automated database dumps that get committed to subversion as well. For files there’ll be a .zip file containing all which is manually stored.
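A minimal sketch of what that kind of dump-and-commit job might look like; the paths, database name, and credentials below are placeholders, not the commenter’s actual setup:

```bash
#!/bin/bash
# Hypothetical nightly job: dump the database into an svn working copy and commit it.
# Paths, database name, and credentials are made up for illustration.

WC=/var/backups/site-wc        # svn working copy containing a db/ folder
DB=example_db

mysqldump --user=backup --password=secret "$DB" > "$WC/db/$DB.sql"

cd "$WC" || exit 1
svn add --force db/                          # picks the dump file up if it is new
svn commit -m "Automated nightly database dump" db/
```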
My host (Dreamhost) does this for me automatically.
If you have Transmit you can use Automator to sync the online files to a specified local folder. Put this action in iCal and you have an automated backup solution. I believe you can also create a mirror instruction script with Fetch and use that in iCal.
Manual downloads, and then I rely on Mozy to back that up. Works pretty well so far.
I run a Ruby script on my webserver that compresses my domain folders and databases and automatically uploads them to my Amazon S3 account. It was incredibly easy to do (I have full instructions and a screencast here: http://www.christinawarren.com/2008/06/24/s3-backup-media-temple-gs/ — they are aimed at (mt) Media Temple customers, but the script can be modified to run on any server that has Ruby installed), and I get an updated copy every day on S3. I have it set to replace an existing copy (I’m not doing anything where I need recursive backups), but it would be easy to set it to store 7 days’ worth at a time before overwriting.
Donnie — you could avoid Jungle Disk altogether and just have a separate cron job that sends the backup straight to S3. It would certainly be faster/more automated. I think someone even wrote something up for people using cPanel and S3Sync (the Ruby utility I based my script on).
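A rough sketch of that idea, assuming the s3cmd command-line tool rather than the Ruby S3Sync utility mentioned above; the bucket name and backup path are placeholders:

```bash
# Crontab entry: push the nightly cPanel-generated archive straight to S3 at 4am.
# Assumes s3cmd is installed and configured with your AWS keys; paths are made up.
# (% must be escaped as \% inside crontab entries.)
0 4 * * * s3cmd put /home/me/backups/backup-$(date +\%F).tar.gz s3://my-backup-bucket/
```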
I can’t stress enough how important it is to have backups. Even if your host does it, consider doing at least a monthly database backup, just in case.
I use rdiff-backup for both sites and databases.
I develop my website offline, using a local webserver to run the scripts that drive it. I then upload changes to the live site via ftp. This approach gives me a complete offline mirror of the site, which gets included in my (admittedly sporadic) backups of the hard drive.
rdiff-backup, cron jobs and SSH do the trick for me. The databases are dumped daily using MySQLDumper…
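For anyone curious, a minimal sketch of what cron-driven rdiff-backup over SSH can look like; the hostnames and paths are assumptions, not either commenter’s actual setup:

```bash
# Incremental backup of the web root to a remote machine over SSH.
# rdiff-backup keeps a current mirror plus reverse diffs, so old versions stay recoverable.
rdiff-backup /var/www/ backupuser@backup.example.com::/backups/www/

# Prune increments older than 8 weeks so the backup store doesn't grow forever.
rdiff-backup --remove-older-than 8W backupuser@backup.example.com::/backups/www/
```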
Using cron jobs to create backups every 24 hours: creating database dumps, backing up all server and software settings, backing up maildirs… then creating tarballs of each backup and zipping everything at the end.
Download them regularly and then burn them to DVD from time to time.
Do not rely on “the host does it for me”. I know a couple of people who lost all their work because of this.
I back up my website folder every Sunday at 2am automatically to my 500GB external HD.
I burn it to a DVD every so often: 2 copies, 1 for this house and the other kept at my uncle’s.
I am also looking into IDrive and Xdrive; you can store up to 2GB or 5GB of files for free.
Rsync to a linux server at home.
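That can be as simple as a single cron-driven command; a sketch with made-up hostnames and paths:

```bash
# Mirror the web root to a Linux box at home over SSH, deleting files that
# no longer exist on the server so the copy stays an exact mirror.
rsync -az --delete -e ssh /var/www/ me@home.example.com:/backups/www/
```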
Carbonite! Online backup app for about $50 a year. In addition, most of the hosting companies we use provide back-ups. AND, I usually make archive copies locally when the site undergoes any major changes.
I use a wordpress plugin to backup my content. Then I have my theme files on my local machine since I develop the themes locally.
My host does this for me. (I checked)
I am seeing some great ideas here. I hope that Chris Coyier tries some of these out and does a ‘how to’ post on the one he finds works best for him.
I “Make a Ready Defense by Planning for Failure” … that is, I crontab a little bash/MySQLDump solution every night …
… monthly storing any markup changes to my sites simply via the miracle of the tar command.
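For readers who haven’t set one of these up before, a guess at what such a crontab could look like; the script name and paths are placeholders, not the commenter’s actual files:

```bash
# Nightly MySQL dump at 2:30am, monthly tarball of the site files on the 1st at 3am.
# (% must be escaped as \% inside crontab entries.)
30 2 * * * /home/me/bin/dump-databases.sh
0 3 1 * * tar czf /home/me/backups/site-$(date +\%Y\%m).tar.gz /var/www/example.com
```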
First, I keep all my sites in SCM. Git is my preferred SCM. I keep my Git repositories on my linux server here in the office. So not only is everything backed up there, I have a complete history of additions, deletions, and changes. SCM may sound like overkill for static websites, but it’s not. Subversion is good too, but branching is a pain.
Second, I love Time Machine. At first, I thought it was just a gimmick and too simplistic for how I work. I stand corrected. TM makes getting back to a known state or previous version of a graphic not only simple, but very quick. And it’s fairly unobtrusive.
Finally, I use Capistrano to manage most of my websites. Capistrano is typically used by Ruby on Rails developers, but it works just as well for any website. It allows me to execute a series of remote scripts via SSH on the server to wrap everything up in a tarball and SCP it back to me. SQL databases included. I can take the site down for maintenance, push updates, and roll back to previous versions. Trust me, once you capify, you won’t go back. So where is the backup in Capistrano? On the webserver. You can tell Capistrano to keep X number of versions on the server, compressed if you like.
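A minimal sketch of the office-server Git setup from the first point above, assuming SSH access to the server; the hostnames and paths are made up:

```bash
# On the office server: create a bare repository to push into.
ssh me@office-server 'git init --bare /srv/git/mysite.git'

# On the workstation: add the server as a remote and push the full history.
cd ~/sites/mysite
git remote add office me@office-server:/srv/git/mysite.git
git push office master
```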
I keep local copies on multiple machines (work, home) and removable media and keep them all organized with Subversion, which is hosted by my webhost. Old stuff gets scrubbed from the local copy and shoved on a heavily compressed DVD backup semi-annually.
And I took Mean Dean’s advice a while ago – crontab and bash/MySQLDump nightly. Is that overprotective?
I get obsessive about backups. For each of my sites, I backup two copies (zip and gzip) of my databases after every post, significant round of comments, and/or plugin changes. These backup files are stored on my local machine, which is backed up with a secondary external drive. Once a week, I also synchronize and backup all of my sites’ files. Finally, once a month (roughly), I backup everything onto a third external drive that is kept locked up in a fireproof safe. Great topic, Chris — worthy of more coverage for sure..
I print it out in binary, forcing the fifty korean children locked in my basement to remember the digits (else they won’t be fed). The basement is then secured by a vault door only penetrable by nuclear missiles or raving penguins with chainsaws.
Okay, on a serious note, I usually back up automatically through the CMS’ plugins, or set up a cron job with automatic mailing functionality. That way I receive backups by email, which are automatically put in the right folder. Easy and it works.
I use the wp-backup plugin for wordpress ;)
And for the files, I download them :D
The websites I make are mostly just HTML/CSS, sometimes with simple PHP. I like to use all of my website projects as a reference when I’m working on new projects (you know, when you have that “How did I do that before?” moment), and I do freelance both at home and at work, when I have the time. I’ve got a copy of all of my websites locally on both my home computer and my work computer – I figure that should keep me safe enough.
“I have some kind of automated solution. (do share!)”
I use a WordPress plugin called [BackUpWordPress][1], which automatically backs up my database and all the files on my server, and serves them to me in a tar.gz file which I can then download. The tar.gz files are furthermore backed up using Jungle Disk, which I use to back up my laptop.
[1]: http://wordpress.designpraxis.at/plugins/backupwordpress/
We use Dropbox, which provides us with multi-computer syncing and file versioning. It maps a folder in your system (Mac, Win, Linux), you work directly in that folder, and the files you change are automatically committed to the server and to all the other computers sharing the Dropbox folder. Useful.
My host backs up weekly, but this poll has me thinking…
Seems like it should be pretty simple to create a web page (PHP in my case) that simply dumps the entire site database, e.g. as an XML file. Then create a Google App Engine application that uses their fetch URL API to query this page and store the results in the Google App Engine datastore. The free account is limited to 500 MB, but that should be fine for a lot of sites.
The only question is the most reliable way to periodically trigger the Google App Engine fetch. You could set up a page on the Google App that triggered it, and then a cron job to periodically query that page, but that seems clunky. Maybe there’s some way to periodically run a command natively in the Google App Engine API. I have no Python experience, so I don’t know where to start looking for that.
This would give you an automated off-site backup to Google servers. Probably more reliable than storing it locally unless you’ve got a RAID array set up.
I used to just keep a copy on my HDD.
But now I use subversion.
I still haven’t found a good way of backing up a db though… I might have to learn how to set up a cron job for this thing.
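In case it helps, a minimal cron-able sketch of a nightly database dump; the database name, credentials, and paths are placeholders:

```bash
#!/bin/bash
# Dump one MySQL database to a date-stamped, gzipped file and keep roughly 30 days of copies.

BACKUP_DIR=$HOME/db-backups
DB=example_db
mkdir -p "$BACKUP_DIR"

mysqldump --user=backup --password=secret "$DB" \
  | gzip > "$BACKUP_DIR/$DB-$(date +%F).sql.gz"

# Delete dumps older than 30 days.
find "$BACKUP_DIR" -name "$DB-*.sql.gz" -mtime +30 -delete
```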
I actually work off my computer and upload the changes I make to my website to my host’s servers. I back-up my computer on an external drive.
My host does this for me (Host Europe). I simply download the compressed backups via FTP – very convenient.
I use an automated solution for file backup from/to the server. The software is named Allway Sync; it’s simple and it works.
Software to make automated backups of MySQL and MSSQL databases would be great.
I currently use rsync to back up to BQInternet.com, which is hosted in New York. My dedicated server is in Dallas, TX.
I’m in the process of buying my own server and co-locating it in Dallas, TX as well. The new server will be geared with (4x) 146GB SCSI drives in RAID 10, which will provide some degree of on-site backup, and I’ll continue using the server from BQInternet for off-site backups.
A few things:
Firstly, I use subversion to keep a server-local backup, in case I need to boot off another copy of a site (and sometimes, I keep it on a separate domain’s server as well). Second, my server is automatically backed up by my hosting company. Third, I keep all of my sites in a local copy on my iMac. Fourth, that local copy is stored on an external hard drive through Time Machine. Fifth, I keep one more copy on my MacBook, which is automatically synced via Unison with the iMac, so it is always a current or almost current backup.
Well, as I develop almost 99% of my stuff locally, my data is stored and backed up locally and daily (Time Machine for the win! ;) ). Except the databases. I use MySQLDumper to easily create backups every week, download them and store them on a backup drive. So … well, double saved, because my host takes care of such things, too. :)
I keep a local copy and I use a program called Syncback to make a redundant copy on a removable (USB) drive.
I’m pretty plain when it comes to this… I just download all the files into a folder locally. Sometimes I’ll then put it onto an external HDD, depending.
@Everybody: Thanks to everyone for sharing! Wow, lots of good ideas in here. As I kind of mentioned in the article I just kind of periodically download local copies and let my normal backups deal with it. But… I think this has inspired me to do something a little more automated and hardcore.
I have my blog’s database backed up every night (I get the sql file emailed to me daily), and download all of my important files once a month. The SQL files are also stored by Google because I use Google Apps for my email, and the theme files are backed up on another server (by another host that I use for a different reason) and on my external hard drive. Occasionally I’ll upload a backup of my theme + wp-images folder + sql file to Dropbox. I’m a little bit obsessed. ;)
I actually develop everything on my Mac running Leopard and similar to your image it’s all backed up with Time Machine. Using MAMP to design and develop multiple blog projects allows me to keep everything in a local WordPress install and it automatically gets backed up.
As for the database, I do it all manually right now but I am looking at using Automatic MySQL Backup to see how well it works.
Plesk for weekly backups of server config stuff and source code, and a tiny little script I wrote to dump my DBs every night, and send them to a remote server: http://nicksergeant.com/blog/programming/quick-shell-script-backing-databases-ftping-them-remote-server-and-notifying-me-any-file-changes
Oh, and everything is Subversioned-up, too. Though the repo is on the same server as prod, so that’s a point of failure, but I have the code checked out on a local machine (and Time Machine backed up).
At the moment there’s not much going on besides my blog, so I just download the export/backup file that wordpress can produce.
I plan to build similar functionality into my own cms when I get round to it.
I develop offline. I just need to back up the databases manually.
I download copies locally and then have a Mozy.com account that backs them up for me. While developing a new site I use my Dropbox account to sync them to 3 different computers. So even if a hard drive crashes I have 2 other options that have very recent versions of the site I was working on. The only thing that really takes a while to sync with Dropbox is a .psd. Everything else is near instant.
SVN.
Each version is backed up and annotated manually; a backup of the backup is automatically made on one of our servers.
To keep all our work in line between the different people working on a given project, we check out from or upload to the repository. This way we’re always working on a communal, up-to-date version.
I’m using the normal export function of WordPress. Sometimes (about 3 times a year) I download the full folder locally. If my hosting does this for me I don’t need it :)
I think I might have a bit of overkill when it comes to backups.
My host does backups (I’ve checked).
I have an Automator script that uses Transmit to sync files to my Mac nightly, and the Mac is backed up with Time Machine. The WordPress plugin Database Backup also runs nightly and emails the dump to my Gmail account.
So for me to lose my site would entail a disaster that destroys both my Mac and the Time Machine drive at the same time my host is destroyed. For that to come about, something very bad would be happening in the world, and I think I would have bigger things to worry about than my website.
I always make DVDs for backup of my websites on a monthly basis including all files and databases.
I backup my web files on an external WD hard drive in my office, and I backup often. I also compress the files and upload them into a secure folder on one of my personal domains.
I’ve made the mistake of not backing up, like many people have. I have no intention of repeating that mistake again.
I’d love to start using Subversion to back up, especially now that Coda has support for it.
The only problem is that for the life of me, I just can’t understand how to set it up. Every ‘set up’ article looks like it’s written for developers who spend their entire lives in the Terminal. All these ssh strings are nonsense to me.
Now if someone would like to write an article that normal people (ie. front-end designers!) can actually understand – that would be seven shades of awesome.
@Nathan
SSH is something you’ll have to get used to with Subversion. I would recommend learning a little bit of SSH first (it’s actually quite simple once you get the hang of it) – there really isn’t all that much to learn.
As far as Subversion, that can be a little more involved. I don’t grasp everything with it, just enough to do what I need. Check-in (ci), Check-out (co), Export, that’s about all. Capistrano normally takes care of the rest for me.
@Nathan:
There are several GUIs for Subversion that make managing a repository a little easier. But to be honest, you have to understand the fundamentals of SVN before the GUIs will make any sense to you. Keep in mind that just because you have a project under version control, that doesn’t mean it’s backed up, unless you are using a remote repository and/or server.
The real benefit of having your project in version control is that you can roll back changes that you have made (e.g. you made a huge mistake the client is furious about). With just a couple of commands, you can be back to a working TAG. Then go about fixing your problems in the TRUNK, or make a BRANCH to fix a specific problem. Then MERGE all your changes back together. Sounds difficult, but it’s something you end up doing several times a day, so you get good at it pretty quickly.
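To make that workflow a bit more concrete, a rough sketch of the commands involved; the repository URL and names are placeholders:

```bash
REPO=http://svn.example.com/mysite

# Roll back by checking out a known-good tag into a fresh working copy.
svn checkout $REPO/tags/release-1.2 mysite-rollback

# Cut a branch to fix a specific problem.
svn copy $REPO/trunk $REPO/branches/fix-contact-form -m "Branch for contact form fix"

# Later, merge the fix back into a trunk working copy and commit.
cd mysite-trunk
svn merge $REPO/branches/fix-contact-form
svn commit -m "Merge contact form fix back into trunk"
```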
I’ve switched all my projects to Git and like it quite a bit better. It’s much faster than SVN and I like the visual tools that show you branches. Git was developed to manage version control for the Linux kernel, so it has quite a bit of weight behind it.
So go ahead, jump into the Terminal, the water is nice!
Well, rather than diving into the terminal then, are there alternative versioning systems which are more straightforward?
You can find easy web-based setups (sometimes even one-click!) for things like content management systems, databases, etc – why isn’t there anything like that for setting up versioning? It really strikes me as being way more complicated and involved than it needs to be. I’ve tried doing the SSH setup through the terminal and a) it didn’t work, b) I had no idea why it didn’t work because I didn’t have a clue what I was doing, and c) the guides for my server for setting it up (and any guide for that matter) always assume you know what you’re doing.
I use Media Temple to host. I have a local copy of everything (I have an Ubuntu install with RAID 5). I check everything into Subversion on MT. I run a cron job to dump the database and check it into svn (nightly, weekly or monthly depending on how frequently the site is updated). I back everything up to DVDs once every 6 months and then store them in my personal safe deposit box at the bank.
I also have a development server (also RAID 5) that tarballs its local svn copy every week. Just for extra peace of mind.
I also give all my clients ssh access and tarball their files/db nightly then tell them it is their responsibility to run their backups.
Given the fact that I’m not very much into cron jobs, I’m using SiteVault 2.2 (http://www.site-vault.com ), which backs up DBs and FTP. Thanks to this thread, I just discovered that v.3 now supports Secure FTP. I was considering switching to Handy Backup (http://www.handybackup.net) but it is really more expensive and I’m not sure about the features. I will perhaps download it to give it a try though, because their last version is pretty recent (June).
Then obviously I backup the whole folder with Mozy. I do that for all my clients.
You might wanna take a look at this: http://davidwalsh.name/backup-mysql-database-php
There are a lot of great resources out there that can help you backup your website information. Just do your research to ensure that you get a good deal working with a reputable company.