Backing up MySQL to a remote site

I now run two webservers in different locations. The basic platform is Apache with Joomla on top as a CMS, plus a smattering of WordPress and Roundcube for good measure. I needed to do two things, though: back up the databases to another location and ensure static image content was mirrored between the two sites. I didn’t want to keep the MySQL servers in sync so much as have a backup of the CMS content; I intend to import old database copies as and when I need to. Here is how I did it:

(Replace theuser with the real username, thepassword with the real password, and X.X.X.X with your real destination FTP server.)

lftp -c 'open -e "mirror /otndocs/images/ /var/www/otndocs/images/" -d -u theuser,thepassword X.X.X.X'

lftp -c 'open -e "mirror /otndocs/templates/rhuk_milkyway/ /var/www/otndocs/templates/rhuk_milkyway/" -d -u theuser,thepassword X.X.X.X'

The above commands use lftp to mirror two directories. The different paths reflect how the target FTP server holds content in different folders to the source webserver. I have to be picky with what I transfer because the target server is a Windows XP machine and the source runs Ubuntu Server. If both machines matched paths and underlying O/S, you could pretty much dump the whole server over in one go, as sketched below, though that is a bit risky as you would carry any malicious code over as well. I stuck with just images for the moment, though the second line shows how to copy a specific template as well.
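For the record, here is a rough sketch of what that whole-server mirror might look like if both ends did share the same document root. The paths are an assumption rather than what I actually run; the -R switch reverses the mirror so local content is pushed up to the remote server, and --only-newer skips files that haven’t changed.

lftp -c 'open -e "mirror -R --only-newer /var/www/ /var/www/" -d -u theuser,thepassword X.X.X.X'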

Here is the script to back up the MySQL databases. It dumps, gzips and uploads the files one by one into a target folder with today’s date on it. I’ve added my crontab entry at the end to show how you could, like me, schedule it to run at 3am for example. Again, replace the placeholder usernames, passwords and addresses with your own.

#!/bin/sh

# List all of the MySQL databases that you want to backup in here,
# each separated by a space
databases="mysql otndata wordpress"

# Directory where you want the backup files to be placed
backupdir=/backup

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# MySQL Username and password
userpassword=" --user=themysqluser --password=themysqlpassword"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Unix Commands
gzip=/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# Send Backup?  Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="Your MySQL Backup"
mailto="person@mailbox.org"

# Create our backup directory if not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
echo "Not a directory: ${backupdir}"
exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
$mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
rm -f ${backupdir}/${database}.sql.gz
$gzip ${backupdir}/${database}.sql
done

# Send the backups via email
if [ "$sendbackup" = "y" ]
then
for database in $databases
do
# uuencode needs both the input file and a name for the decoded file
$uuencode ${backupdir}/${database}.sql.gz ${database}.sql.gz > ${backupdir}/${database}.sql.gz.uu
$mail -s "$subject : $database" $mailto < ${backupdir}/${database}.sql.gz.uu
done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"

# FTP destination details
HOST='X.X.X.X'
USER='theuser'
PASSWD='thepassword'
DATE=$(date +%d%m%y)

# Our local directory containing files to FTP
cd ${backupdir}

# Process each file in the directory, FTPing each one
for filename in ${backupdir}/*; do
fn=$(basename "$filename")
echo "$fn"

# Upload in binary mode so the gzipped dumps are not corrupted in transit
ftp -n -v <<-EOF
open ${HOST}
user ${USER} ${PASSWD}
binary
cd /
mkdir $DATE
cd $DATE
pwd
put $fn
bye
EOF
echo "FTP upload attempt for $fn complete"

done

echo "FTP complete!"
exit
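Before scheduling it, make the script executable and give it a test run by hand. The path below assumes you saved it as /home/simkin/backup-databases, which is what my crontab entry points at:

chmod +x /home/simkin/backup-databases
/home/simkin/backup-databases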

Type crontab -e to add your script to the schedule:

0 3 * * * /home/simkin/backup-databases
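Restoring one of these backups later is just the reverse of the dump. As a rough sketch, assuming the otndata dump sitting in the /backup folder and the same MySQL credentials as above, something like this pulls an old copy back in:

gunzip < /backup/otndata.sql.gz | mysql --user=themysqluser --password=themysqlpassword otndata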