In my previous post I showed my WordPress update script. However, it’s not safe to update without first backing everything up in case something goes wrong. This is a script that I adapted from this post. It backs up both files and the database.
#!/bin/bash echo "In $0" if [ $# -gt 0 ]; then NOW=$1 else NOW=$(date +"%Y-%m-%d-%H%M") fi FILE="maksle.com.$NOW.tar" BACKUP_DIR="/home/private/backups" WWW_DIR="/home/public/blog" DB_HOST="dbhost" DB_USER="backupUser" DB_PASS="backupUserPassword" DB_NAME="wp_db" DB_FILE="maksle.com.$NOW.sql" # WWW_TRANSFORM='s,^home/public/blog,www,' # DB_TRANSFORM='s,^home/private/backups,database,' WWW_TRANSFORM=',/home/public/blog,www,p' DB_TRANSFORM=',/home/private/backups,database,' # tar -cvf $BACKUP_DIR/$FILE --transform $WWW_TRANSFORM $WWW_DIR tar -cvf $BACKUP_DIR/$FILE -s $WWW_TRANSFORM $WWW_DIR mysqldump --host=$DB_HOST -u$DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/$DB_FILE # tar --append --file=$BACKUP_DIR/$FILE --transform $DB_TRANSFORM $BACKUP_DIR/$DB_FILE tar --append --file=$BACKUP_DIR/$FILE -s $DB_TRANSFORM $BACKUP_DIR/$DB_FILE rm $BACKUP_DIR/$DB_FILE gzip -9 $BACKUP_DIR/$FILE
You may have noticed that there are commented-out versions of the tar transform variables and commands. My host has a version of tar (bsdtar 2.8.5) that doesn't have the --transform option, but does have an alternative -s option that does more or less the same thing. The idea is that the backup will have the directory structure www/file.php rather than /home/public/blog/file.php, for example.
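To check that the rewrite worked, you can list the finished archive and look at the member paths. This is just a sanity check built from the paths used in the script; the entries shown are illustrative, not actual output.

# List the compressed archive produced by the backup script.
# <timestamp> stands for whatever $NOW was when the script ran.
tar -tzf /home/private/backups/maksle.com.<timestamp>.tar.gz
# Member paths should start with the rewritten prefixes, roughly:
# www/index.php
# www/wp-content/...
# database/maksle.com.<timestamp>.sql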
mysqldump has many options you can pass it, which you may want to look into. However, the --opt option is on by default and does what I want; it is probably good enough for most sites. The catch with --opt is that it locks the tables during the export, which has implications for the permissions your backup user needs. What backup user? Well, since you are storing the DB user and password in plain text in your script, you should not use your administrator account. It's best to create a backup user with the minimal permissions necessary to do the backup. Ideally that would be just SELECT privileges, but with the aforementioned --opt option, LOCK TABLES privileges are required too. Here's how you'd set that user up:
MySQL> CREATE USER backup IDENTIFIED BY 'randompassword';
MySQL> GRANT SELECT ON *.* TO backup;
MySQL> GRANT LOCK TABLES ON *.* TO backup;
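To confirm the new user has enough privileges before the cron job runs for real, you can do a throwaway dump as that user. The host and database names here are the placeholder values from the backup script above, so substitute your own:

# Dry run: dump to /dev/null just to verify SELECT and LOCK TABLES work.
mysqldump --host=dbhost -ubackup -prandompassword wp_db > /dev/null \
  && echo "backup user has sufficient privileges"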
I call the backup script above from a cron job on my local computer:
#!/bin/bash

# Exit if any command fails
set -e
# Don't allow use of uninitialized variables
set -u

# Set up some variables
NOW=$(date +"%Y-%m-%d-%H%M")
BACKUP_DIR="$HOME/Documents/backups"
LOG_DIR="${BACKUP_DIR}/logs"
LOG_FILE="maksle-backup-$NOW.log"

# Make sure the log directory exists before redirecting into it.
mkdir -p $LOG_DIR

# Redirect standard output and error output to a log file.
exec > >(tee -a "${LOG_DIR}/${LOG_FILE}")
exec 2> >(tee -a "${LOG_DIR}/${LOG_FILE}" >&2)

cd $BACKUP_DIR

# The cool part: run my local wp-backup.sh on the remote web server.
ssh maksle 'bash -s' < ~/bin/wp-backup.sh $NOW

# Sync the remote server backups with the backups directory on my local
# machine. After all, what good are backups if your webserver is down and
# you can't access them?
rsync -havz --stats maksle:/home/private/backups/ $BACKUP_DIR
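For completeness, the crontab entry that drives this looks something like the line below. The schedule and the wrapper's filename (wp-backup-driver.sh) are hypothetical; use whatever you named the script above.

# Hypothetical crontab entry: run the backup wrapper every night at 4:30.
# min hour day month weekday  command
30 4 * * * "$HOME/bin/wp-backup-driver.sh"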
Of course, the remote server can get filled up with backups, so I have another script that removes all but the most recent backups there (five, in my case). On my local machine I keep backups going as far back as I want.
#!/bin/bash
set -e
set -u
# Error out if a command in a pipe fails
set -o pipefail

# Usage example:
# wp-remove-old-backups.sh /home/private/backups 5
WORKING_DIR=$1
cd $WORKING_DIR

# This would be 5 if called as in the Usage example
declare -i allow=$2

# This gets the number of files in the directory, which we assume are all backup tgz files
declare -i num=$(ls | wc -l)

if [ $num -gt $allow ]; then
    # Remove all but the latest files
    (ls -t | head -n $allow; ls) | sort | uniq -u | sed -e 's,.*,"&",g' | xargs rm -f
fi
The above command works by first printing the latest 5 files, and then all of the files, so the latest 5 get printed twice. This allows uniq -u to filter out the latest 5, and the rest of the files get sent to their slaughter. The intermediate sed -e 's,.*,"&",g' makes it work when there are spaces in the filenames by wrapping each filename in quotes (still, best to avoid spaces in filenames).
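If you want to see what the pipeline would delete before trusting it with rm, you can swap the final stage for echo. This is the same logic as in the script, just as a preview:

# Preview the files that would be removed, without deleting anything.
# Run from the backups directory; 5 is the number of newest files to keep.
(ls -t | head -n 5; ls) | sort | uniq -u | sed -e 's,.*,"&",g' | xargs echo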
Of course, I call this script via a local cron job as well.
#!/bin/bash

NOW=$(date +"%Y-%m-%d-%H%M")
BACKUP_DIR="$HOME/Documents/backups"
LOG_DIR="${BACKUP_DIR}/logs"
LOG_FILE="maksle-backup-cleanup-$NOW.log"

# Make sure the log directory exists before redirecting into it.
mkdir -p $LOG_DIR

exec > >(tee -a "${LOG_DIR}/${LOG_FILE}")
exec 2> >(tee -a "${LOG_DIR}/${LOG_FILE}" >&2)

# Run the cleanup script on the remote server, keeping the 5 newest backups.
ssh maksle 'bash -s' < ~/bin/wp-remove-old-backups.sh "/home/private/backups" 5
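And the matching crontab entry, again with a hypothetical name and schedule for the wrapper; running it a little after the backup job means the remote server gets pruned right after each new backup lands.

# Hypothetical crontab entry: prune remote backups shortly after the nightly backup.
45 4 * * * "$HOME/bin/wp-backup-cleanup-driver.sh"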
I hope that will help someone out!