Backing up a PHP MySQL site regularly on Ubuntu

There are already a number of examples out there on how to do this, but here is a complete recipe.

I’ll show you how to:

  • Export all MySQL database tables with a given prefix.
  • Save all files in the public directory, together with the MySQL dump, in a compressed archive file.
  • Use crontab for regular backup.

The script for doing all this can be found at the end of the post.

Exporting MySQL databases with a given prefix

To export a MySQL database, the command mysqldump is a handy tool. It takes username, password, and database name as arguments and outputs SQL code that can be run to restore the current database. To export specific tables, one can provide a list of table names.
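
For example, with the placeholder credentials used in this post (table1 and table2 are made-up table names), dumping just those tables would look like this:

mysqldump -u myUserName --password=myPassword myDatabaseName table1 table2 > dump.sql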

What I want to do is export tables with a given prefix, for example all tables whose names begin with myPrefix_. This cannot be done directly using mysqldump, so let’s get our custom table list using the mysql command.

mysql -u myUserName --password=myPassword -e "SHOW TABLES;" myDatabaseName | grep -e '^myPrefix_'

Running that command will hopefully list the tables you need (the ^ anchors the pattern to the start of the line, so tables that merely contain the prefix somewhere in the middle are not picked up). To pass this list to mysqldump, we first save it to a variable named TABLES.

# Get the names of all tables starting with the prefix
TABLES=$(mysql -u myUserName --password=myPassword -e "SHOW TABLES;" myDatabaseName | grep -e '^myPrefix_');
# Dump the tables to dump.sql ($TABLES is deliberately unquoted so each name becomes its own argument)
mysqldump -u myUserName --password=myPassword myDatabaseName $TABLES > dump.sql;

Now you’ve dumped the tables you need to dump.sql.
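
Should you ever need to restore, the dump can be fed straight back through mysql (same placeholder credentials as above):

mysql -u myUserName --password=myPassword myDatabaseName < dump.sql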

Compress all files

A .tar.gz file with all the site files and the database dump would be nice, aye? Let’s do that.

The commands tar and gzip can help us bundle the files and compress them. (Actually, tar can bundle AND compress in one go, but it cannot append to an already-compressed archive; since we first add the public files and then append the database dump, we must bundle and compress as separate steps.)

To add all files in a folder to a .tar archive, run:

tar -cf target.tar /home/steffe/public_html;

To add a file to the archive (in our case, the database dump), run this:

tar -rf target.tar dump.sql;

To compress the archive and produce target.tar.gz, run:

gzip target.tar;
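
To double-check what ended up in the archive, tar can list (or extract) the contents of the compressed file directly; the z flag tells it to read through gzip:

tar -tzf target.tar.gz    # list the archive contents
tar -xzf target.tar.gz    # extract everything into the current directory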

Regularly execute a command

To make backups regularly, we will use crontab. It is really easy to use. To add a task, run:

crontab -e

This opens your crontab in an editor. Make your changes, save the file, and close the editor to install the new schedule.

For example, if we want to run the command updatedb at five a.m. every Monday, we add the line:

0 5 * * 1 updatedb

The format is “Minute Hour DayOfMonth Month DayOfWeek Command”. You can add any command with any flags or arguments. Detailed instructions on how to use crontab can be found in its man page (man 5 crontab).
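
To verify that the task was saved, list your current crontab:

crontab -l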

Wrapping it all together

This is the script I ended up with; it combines everything above and also adds a timestamp to the backup files’ names.

#!/bin/bash

# Where the finished backups go
BACKUP_DIR="/home/steffe/backup"
# The directory to back up
DIR_TO_BACKUP="/home/steffe/public_html"
DB_USER="steffe"
DB_PASS="noyoudont"
PREFIX="wp_"
DB_NAME="database1"

echo "Getting table names...";
# The pattern is anchored so only tables whose names start with the prefix match
TABLES=$(mysql -u "$DB_USER" --password="$DB_PASS" -e "SHOW TABLES;" "$DB_NAME" | grep -e "^$PREFIX");
DB_DUMP=/tmp/db.sql;
echo "Dumping database to $DB_DUMP...";
# $TABLES is deliberately unquoted so each table name becomes its own argument
mysqldump -u "$DB_USER" --password="$DB_PASS" "$DB_NAME" $TABLES > "$DB_DUMP";

DATE=$(date +"%Y%m%d_%H%M%S");
TARGET_FILE="$BACKUP_DIR/backup.$DATE.tar.gz";
TEMP_FILE="$BACKUP_DIR/backup.$DATE.tar";

echo "Wrapping it all into $TEMP_FILE...";
tar -cf "$TEMP_FILE" "$DIR_TO_BACKUP";

echo "Adding database data to archive...";
tar -rf "$TEMP_FILE" "$DB_DUMP";
rm "$DB_DUMP";

echo "Compressing...";
gzip "$TEMP_FILE";

echo "Backup done: $TARGET_FILE";

To get started, save the code above to a file, e.g. /home/username/backup.sh. Edit the MySQL login details and folder names at the top. Make it executable by running chmod +x /home/username/backup.sh. Then add the line “0 5 * * 1 /home/username/backup.sh” to your crontab.
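
Put together, the whole setup is something like this (the path is just an example, use your own):

chmod +x /home/username/backup.sh
/home/username/backup.sh    # do a test run before scheduling it
crontab -e                  # then add the line: 0 5 * * 1 /home/username/backup.sh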

Good luck!
