How to automate backups on my ftp server?
December 5, 2009 12:29 PM

I have some wordpress sites hosted at bluehost that I want to keep backed up. Can you help me automate this? Detailed needs inside.

What I've been doing is going to the bluehost control panel every few weeks and manually downloading, first, all the files on the ftp server. Then I'm separately downloading all the dbs. Then I'm throwing them all in a folder on my local machine and have Carbonite back them up remotely. I keep five backup sets, and delete the oldest when I add a new one.

I would like to automate this 100% if possible -- maybe have some kind of script that every week zips up the backups into a one big file on the ftp server, which I can then use SynchBack Pro to mirror to my local HD on a regular basis?

The solution would have to compress all the dbs and a defined portion of the ftp file structure into one file named 120509.* (itself inside a folder called /backups). The following week, same exact thing, except the files are compressed into 121209.*, etc. Each week it should look in /backups for a file named with a date older than 5 weeks and delete it.

Then I would use Synchback every week to pull down the contents of /backups on the ftp server to my local drive, which I will tell Carbonite to pull up remotely.

What do I do to automate the compressing and saving and sequential naming and date-based deleting on the ftp server?

I've never used cron or SSH and only done super basic stuff using phpmysql. I've read through these two askmes and while they did include code snippets that seemed like what I needed, I don't know what to do with them or how to modify them to do what I specifically need. So speak very slowly. Thanks!
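To be concrete, here's roughly the shape of the thing I'm imagining, pieced together from those snippets -- I couldn't write this myself, and every name, path, and credential below is made up. (It uses tar/gzip instead of zip, since those are standard on Linux hosts.)

```shell
#!/bin/sh
# Weekly backup sketch -- every name, path, and credential here is a placeholder.
DATE=$(date +%m%d%y)                    # e.g. 120509 for Dec 5, 2009
BACKUPDIR="$HOME/backups"
SITEDIR="$HOME/public_html"             # the part of the ftp tree to keep
mkdir -p "$BACKUPDIR"

# 1. Dump each database (one mysqldump line per site)
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump -u DBUSER -pDBPASS wp_site1 > "$BACKUPDIR/wp_site1.sql"
fi

# 2. Compress the dumps plus the site files into one dated archive
tar czf "$BACKUPDIR/$DATE.tar.gz" "$SITEDIR" "$BACKUPDIR"/*.sql 2>/dev/null

# 3. Delete any archive in /backups older than five weeks (35 days)
find "$BACKUPDIR" -name '*.tar.gz' -mtime +35 -delete
```

Then cron would run it once a week, and SyncBack would just mirror /backups.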
posted by stupidsexyFlanders to Computers & Internet (7 answers total) 3 users marked this as a favorite
Wordpress consists of three things: a MySQL database with all the textual content, a theme with templates for its layout, and the collection of scripts underneath that together make up Wordpress itself.

The database needs to be backed up regularly, and there are several plugins for this, easy to find through Google. They automate everything that's needed, including making a daily backup, compressed or not. I have mine send a daily backup to a dedicated Gmail account.

The theme, which is the collection of PHP templates and stylesheets that gives your site its look, only needs to be backed up when it changes, which won't be every day. Wordpress generates pages dynamically; the text is pulled from the database. It is good to have a clean backup of your theme ready, though: Wordpress sites can be defaced, and often the theme is what gets corrupted.

Wordpress itself can be downloaded again at any time, and thus doesn't need to be backed up by you.

So the only thing that falls outside of this is the media, like pictures, that you put online. However, in order to put those online, you'll need the 'originals' anyway.

Organize this.
posted by ijsbrand at 1:49 PM on December 5, 2009
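For reference, the plugin-less version of that daily database dump is a single crontab line. This is a sketch: the username, password, and database name are placeholders.

```shell
# crontab entry: every day at 3 a.m., dump the database, compressed.
# DBUSER, DBPASS, and wp_db stand in for your own credentials and db name.
0 3 * * * mysqldump -u DBUSER -pDBPASS wp_db | gzip > $HOME/backups/wp_db_daily.sql.gz
```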

I understand what you're saying, but I don't want five different backup strategies for all the types of WP content. I want to automate everything in one process so in case of difficulties, it's a straight upload to revert to the backup dbs, WP files, theme, media, etc. including all customizations.
posted by stupidsexyFlanders at 2:10 PM on December 5, 2009

It should be possible to automate all of this using something like AutoIt [Windows], but you'd still have to have several programmes run sequentially. If you can code, it's easy to write a script that will do that. But there's no off-the-shelf programme that will let you do this.

What I would do is this: run my FTP client, and use AutoIt to make it download all of the files on the FTP server to a folder. Then I'd use AutoIt to log in to my blog and make it run the function from this plugin to create a backup, and set it to download that into the folder. Then I'd make Winrar compress that folder, using AutoIt to tell it the name of the file based on the date. Then I'd upload that to wherever.

I have to ask, though. Is there some reason you're downloading the Wordpress software in its entirety every week? Are you making serious changes to core files on a regular basis? If you're just fiddling with the theme, that's all you need to back up. And if you are editing core files, you'd be better served by a local MySQL server and Apache to test on, rather than uploading changes and making them live.
posted by Solomon at 2:27 PM on December 5, 2009
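The same sequence can also be run from a command line rather than by driving GUI programs, for anyone on a system with shell tools. A sketch, with the FTP host, login, and paths invented:

```shell
#!/bin/sh
# Command-line version of the download/compress sequence, instead of GUI
# automation. The host, login, and paths are all placeholders.
DATE=$(date +%m%d%y)
WORKDIR="$HOME/wp-backup-$DATE"
mkdir -p "$WORKDIR"

# 1. Mirror the FTP server into a local folder (skipped if wget is absent)
if command -v wget >/dev/null 2>&1; then
    wget --quiet --timeout=10 --tries=1 --mirror \
        "ftp://USER:PASS@ftp.example.com/public_html/" -P "$WORKDIR"
fi

# 2. A database dump would be fetched separately -- e.g. the file the
#    backup plugin produces -- and saved into "$WORKDIR" as well.

# 3. Compress the whole folder into one date-named archive
tar czf "$HOME/$DATE.tar.gz" -C "$WORKDIR" .
```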

No, I'm not editing core files. I download everything because it's easier than figuring out what to keep, and the restore is simpler too. Bandwidth and storage are not issues.

My main concern is keeping it as simple as possible.
posted by stupidsexyFlanders at 3:28 PM on December 5, 2009

Installing a plugin to download the database automatically on each of the five websites is as simple as it gets. There really is no need to back up every file every time; that's a waste of bandwidth. In your stubbornness to keep things simple, you're overlooking that it may be you who's complicating matters.
posted by ijsbrand at 4:01 PM on December 5, 2009

I use a crontab that mysqldumps my DB, tars my files and does an svn up with the whole mess, so I not only have a backup of the latest version of the site, but of all versions.
posted by signal at 5:14 PM on December 5, 2009
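One way to read that workflow (committing each night so the repository history accumulates every version) might look like the crontab entry below. All the names and paths are invented, and it assumes the directory is already an svn working copy with those files checked in.

```shell
# crontab entry: nightly at 2 a.m. Dump the DB, tar the site files, then
# commit both to svn so the repository keeps every version.
# DBUSER, DBPASS, wp_db, and the paths are placeholders.
0 2 * * * cd $HOME/site && mysqldump -u DBUSER -pDBPASS wp_db > db.sql && tar czf files.tar.gz $HOME/public_html && svn commit -m "nightly backup"
```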
