I have some WordPress sites hosted at Bluehost that I want to keep backed up. Can you help me automate this? Detailed needs inside.
What I've been doing is going into the Bluehost control panel every few weeks and manually downloading, first, all the files on the FTP server, then, separately, all the databases. Then I throw everything into a folder on my local machine and have Carbonite back it up remotely. I keep five backup sets, and delete the oldest when I add a new one.
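(For the database half of that, I gather there's a command-line tool called mysqldump that does what I've been doing through the control panel. The askmes I mention below had snippets like this one, where the user, password, and database name are obviously placeholders:

    # dump one database to a .sql file
    mysqldump -u dbuser -pdbpass mysite_db > mysite_db.sql

...but I don't actually know how to run something like that or hook it up to anything.)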
I would like to automate this 100% if possible -- maybe have some kind of script that every week zips up the backups into one big file on the FTP server, which I can then use SyncBack Pro to mirror to my local HD on a regular basis?
The solution would have to compress all the databases and a defined portion of the FTP file structure into one file named 120509.* (itself inside a folder called /backups). The following week, the same exact thing, except the files are compressed into 121209.*, etc. Each week it should also look in /backups for any file named with a date older than five weeks and delete it.
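To make this concrete, here's my best guess at what such a script might look like, stitched together from snippets I've seen -- every path, database name, and password below is a made-up placeholder, and I have no idea whether it would actually run as-is on Bluehost:

    #!/bin/sh
    # Weekly backup: dump the databases, bundle them with the site files
    # into one dated archive, and delete archives older than five weeks.

    BACKUP_DIR="$HOME/backups"       # made-up location for the archives
    SITE_DIR="$HOME/public_html"     # the portion of the FTP tree to keep
    STAMP=$(date +%m%d%y)            # e.g. 120509 for Dec 5, 2009

    mkdir -p "$BACKUP_DIR"

    # Dump each database to a .sql file (user/password/names are placeholders)
    mysqldump -u dbuser -pdbpass site1_db > "$BACKUP_DIR/site1_db.sql"
    mysqldump -u dbuser -pdbpass site2_db > "$BACKUP_DIR/site2_db.sql"

    # Compress the dumps plus the site files into one dated archive
    tar -czf "$BACKUP_DIR/$STAMP.tar.gz" "$BACKUP_DIR"/*.sql "$SITE_DIR"

    # Remove the loose .sql files now that they're inside the archive
    rm "$BACKUP_DIR"/*.sql

    # Delete any archive older than 35 days (five weeks)
    find "$BACKUP_DIR" -name '*.tar.gz' -mtime +35 -delete

If I understand the snippets right, the "date +%m%d%y" part is what produces names like 120509, and the "find ... -mtime +35" part is how you say "older than five weeks." But again, this is me guessing.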
Then I would use SyncBack every week to pull down the contents of /backups on the FTP server to my local drive, which I will then have Carbonite back up remotely.
What do I do to automate the compressing, saving, sequential naming, and date-based deleting on the FTP server?
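From reading around, I think the answer involves cron, which (as I understand it) runs a script on a schedule -- so presumably a crontab line along these lines, where the path is made up:

    # minute hour day-of-month month day-of-week  command
    0 3 * * 0 /home/username/backup.sh

...which I believe means "run backup.sh at 3am every Sunday."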
I've never used cron or SSH and have only done super basic stuff in phpMyAdmin. I've read through these two askmes, and while they did include code snippets that seemed like what I needed, I don't know what to do with them or how to modify them to do what I specifically need. So speak very slowly. Thanks!