The restore did more damage than the crash...
April 10, 2009 6:30 AM
Looking to automate the backup and download of mySQL tables on a cPanel webhost to an external FTP site.
I have an account with a web hosting provider that is running cPanel. I also have a local FTP server in my home (an old G4 Mac running Fedora) that works nicely. I already have a PHP script in place, run weekly via cron, that backs up the ENTIRE cPanel account and FTPs it down to my server here, so I have an offsite copy no more than a week old at all times.
Recently, an issue on the server with another user's website spilled over and caused MySQL to crash hard, corrupting one of my databases. Of course, this happened the night before the backup was due to kick off, so restoring would have rolled us back a week, and the forums on my website would have lost a LOT of important posts made in that time. My provider was able to restore the DB from 24 hours before the crash, but in doing so they also restored all of my other DBs, causing a minor loss of data (again, at a critical time).
Is there a way to script a command, such as with PHP, that will dump, gzip, and FTP down a specified database when run, which I can then set up as a daily cron job? Backing up the entire site daily is not an option; it's almost 2 GB in size due to photo content. Or is there another solution for backing up the DBs automatically that I'm missing? The applications are Joomla 1.0.15, SMF 1.1.8, and Gallery 2.3, if that helps.
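A minimal sketch of the kind of job the poster is asking about, written as a shell function so nothing runs until it is called from cron. Every name here (database, users, passwords, hostname, paths) is a placeholder, not something from the thread:

```shell
#!/bin/sh
# Sketch: dump one database, gzip it, and push it to a home FTP server.
# Meant to be called from a daily cron entry, e.g.:
#   15 3 * * * /home/user/bin/backup_db.sh
backup_db() {
  db=$1
  dbuser=$2
  dbpass=$3
  stamp=$(date +%F)                 # e.g. 2009-04-10
  dumpfile=/tmp/$db-$stamp.sql.gz

  # Dump the one database and compress it in a single pipeline.
  mysqldump -u"$dbuser" -p"$dbpass" "$db" | gzip > "$dumpfile"

  # Non-interactive FTP upload; curl's -T is one portable option
  # (ncftpput or a classic ftp here-document would also work).
  curl -T "$dumpfile" "ftp://ftpuser:ftppass@home.example.com/backups/"

  rm -f "$dumpfile"
}
```

A real script would end with a call such as `backup_db forums dbuser dbpass`; embedding the FTP password this way means the script file should be readable only by its owner.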
I run this script every four hours to back up my databases (I anonymized it a bit; trade secrets!):
#!/bin/bash
SSH=/usr/bin/ssh
KEY=/home/mach5/.ssh/sckey
RUSER=login
RHOST=mydomain.com
DATE=`date +"%F-%R"`
MPATH=/home/mach5/shuttercal/scdb/
MDUMP=mysqldump
PRUNE=/home/mach5/cron/fileprune
PORT=22

$SSH -i $KEY -p $PORT $RUSER@$RHOST $MDUMP db_schema > $MPATH$DATE
$PRUNE -e 1.3 -s 5G $MPATH*

So, running through this: I set up SSH keys between mydomain and mybackupserver. This script is cronned on mybackupserver and runs mysqldump remotely over SSH; the dump is captured in a file named for the date (e.g. 2009-04-10-08:00). After that, a Perl script I found called fileprune deletes files selectively; in my case the directory maxes out at 5 GB, with files deleted on an exponential curve. I also have, on mydomain, a script to check whether a server other than the one allowed is trying to use my keys.
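The key-validation script Mach5 mentions can often be replaced by (or combined with) restrictions written directly into the remote account's authorized_keys file. A sketch, where the hostname and the key body are placeholders:

```
# ~/.ssh/authorized_keys on mydomain.com: lock this key down so it can
# only run the dump command, and only from the backup box.
from="mybackupserver.example.com",command="mysqldump db_schema",no-pty,no-port-forwarding ssh-rsa AAAA...key-material... backup-key
```

With `command=` set, sshd ignores whatever command the client asks for and runs the fixed one, so a stolen key can at worst trigger another dump.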
For most of this, I used jwz's awesome tutorial on how to automate rsync and ssh.
posted by Mach5 at 9:50 AM on April 10, 2009
What I've been doing more and more is running rsnapshot. It can make multiple backups of files a day without using a lot of space or bandwidth, because it only copies new and changed files. For the database backup, I'd write a little shell script to invoke mysqldump over ssh, compress the output on the remote machine, and pipe it to a local file that rsnapshot manages. All it has to do is run the command below. It was easier to put it in a script, though, and just run the script from rsnapshot, because then I didn't have to fuss as much about how the arguments were interpreted.
#!/bin/bash
# /usr/local/bin/backup-remote-mysql
ssh REMOTEUNIXUSER@TARGET_HOST "mysqldump -uMYSQLUSER -pMYSQLPASSWORD TARGET_DB |gzip -f" > TARGET_DB-dump.gz
(substitute the appropriate values for the text in ALLCAPS)
This is run by rsnapshot by adding this to rsnapshot.conf (all one line; make sure you have tabs in the right places):
backup_script /usr/local/bin/backup-remote-mysql TARGET_HOST/backup/mysql/
posted by Good Brain at 9:50 AM on April 10, 2009
Mach5's approach is similar to mine, though I rely on rsnapshot to do things like rotate the files (and take care of file backups). I left out something he alludes to, which I think is covered in the page he links to and in the rsnapshot documentation: you need to set up passwordless, key-based login over ssh to the remote server. Also, maybe there is more I need to learn about MySQL authentication; I couldn't figure out how to get it to work off the credentials of the user running mysqldump.
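For anyone setting this up from scratch, the passwordless key-based login that both answers depend on is roughly the following. The key filename and remote host are placeholders; the two remote steps are left as comments because they need the actual server:

```shell
#!/bin/sh
# One-time setup on the backup machine: generate a dedicated key pair
# with an empty passphrase, so cron jobs can use it unattended.
mkdir -p "$HOME/.ssh"
KEY=$HOME/.ssh/backup_key
[ -f "$KEY" ] || ssh-keygen -t rsa -f "$KEY" -N "" -q

# Then install the public key on the web host and confirm that no
# password prompt appears (run these by hand against the real host):
#   ssh-copy-id -i "$KEY.pub" login@mydomain.com
#   ssh -i "$KEY" login@mydomain.com true
```

Once the verification login succeeds without a prompt, the same `-i` key path can be used in the mysqldump-over-ssh commands above.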
posted by Good Brain at 9:55 AM on April 10, 2009
Response by poster: Thank you all for the help - I'll do some reading up on the links provided, and let you know what solution I went with.
posted by GJSchaller at 6:40 AM on April 14, 2009
Best answer: I wound up using this script I found online:
http://www.ameir.net/blog/index.php?/archives/18-MySQL-Backup-to-FTP-and-Email-Shell-Script-for-Cron-v2.1.html
posted by GJSchaller at 11:49 AM on March 11, 2010
You should be able to use cron, mysqldump, gzip, and scp to do everything you need.
posted by christonabike at 7:34 AM on April 10, 2009
This thread is closed to new comments.