Database backup

Image by C M on Unsplash

Databases are used to store all sorts of data. Most websites, like www.haxor.no, run on a CMS (Content Management System) that manages the site's content, and a web application uses that content to present nice, dynamic web pages.

In this article, I will present you with a simple method of backing up databases on a backup server every night for safekeeping.

Backup

Why? How often?

Databases are subject to unintentional deletion, data corruption, ransomware, and other forms of data loss.

On websites, the database content is often the most valuable part of the site. Countless hours are spent creating and optimizing that content, so the consequences of data loss can be huge for the website owner.

Because of this, it's a good idea to make at least daily backups of your website's database. If someone accidentally deletes content, or something worse happens, you can easily restore it from a backup.

There are many philosophies on how backups should be stored. For simplicity's sake, I take a backup every night and keep the backups for as long as I want.

If I decide to reduce the number of backups I keep, this can be handled by a separate cleanup script on the backup server. By adhering to a strict naming convention for the backup files, this is easy to automate.
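
As an illustration only, a minimal cleanup sketch could look like the one below. It assumes the backup directory and naming convention used later in this article, and a hypothetical retention period of 90 days; adjust both to your own setup.

#!/bin/bash
# Hypothetical cleanup sketch: delete local backups older than 90 days.
# Assumes files are named DB.www.haxor.no.<year>.<week>.<weekday>.gz
backupDir=/home/backup_user/backup/www.haxor.no
find ${backupDir} -name 'DB.www.haxor.no.*.gz' -type f -mtime +90 -delete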

Script

Automate the process

The script is based on the one I wrote in the Norwegian article Backup with Linux. Please note that there are a lot of automated tools out there that back up many databases and websites in one go. For me, this script works just fine. In addition, I can easily modify it to back up other things.

vim /home/backup_user/scripts/backup_DB_www.haxor.no.sh
backup_DB_www.haxor.no.sh
#!/bin/bash
#
# Originally written by Stanley Skarshaug 18.01.2022
# Script intended to be run once a day, to back up a database on a remote host
#
# Change these variables to fit your needs
#

remoteDir=/home/backup_user/backup
backupName=DB.www.haxor.no
remoteHost=haxor.no
remoteUser=backup_user
localPath=/home/backup_user/backup/www.haxor.no/
dbName=databasename
dbUser=dbusername
dbPass=dbpassword

dateString=$(date +'%Y.%U.%w')
weekString=$(date +'%Y.%U')

destinationFile=${remoteDir}/${backupName}.${dateString}.gz

infoColor='\033[0;32m'
noColor='\033[0m'
infoBox="${infoColor}[INFO]${noColor}"

# Create backup on the remote host
printf "${infoBox} Creating a backup on the remote host... "
ssh ${remoteUser}@${remoteHost} "mysqldump -u ${dbUser} -p${dbPass} ${dbName} 2>/dev/null | gzip > ${destinationFile}"
printf "   DONE \n"

# Check if the local backup directory exists, if not create it.
if  [[ ! -d ${localPath} ]]; then
	printf "${infoBox}   Local backup folder was not found, creating it now. \n"
	mkdir -p ${localPath}
fi

# Transfer backup to backup server
printf "${infoBox} RSYNC backup... "
rsync ${remoteUser}@${remoteHost}:${destinationFile} ${localPath}
printf "   DONE \n"

# Cleanup. Delete the backup from the remote host
printf "${infoBox} Cleanup: Remove backup file on remote host..."
ssh ${remoteUser}@${remoteHost} "rm ${destinationFile}"
printf "   DONE \n"

printf "${infoBox} FINISHED - Backup job complete!\n"

All this script does is make a backup of the database with mysqldump on the webserver, pipe the output from that process into gzip, and store the compressed backup as a file on the webserver. It then rsyncs the database backup to our backup server. Once the backup is safe on the backup server, the original backup file on the webserver is deleted.
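
Should you ever need it, restoring is just the reverse: copy the compressed dump back to the webserver and pipe it into mysql. A minimal sketch, assuming the database name and user from the script above and an example file name (the date part will differ):

# Example only: restore a compressed dump into the database (run on the webserver)
gunzip -c DB.www.haxor.no.2022.03.2.gz | mysql -u dbusername -p databasename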

Please note that the DB password will be stored in this script, so set file permissions on the script to reflect that risk. In my case, I don't worry about it because the backup server only has one user, and that is the backup user.
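
For example, you could start by making the script readable and writable only by its owner:

chmod 600 /home/backup_user/scripts/backup_DB_www.haxor.no.sh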

To run the script, you also have to add execute permission for your own user:

chmod u+x /home/backup_user/scripts/backup_DB_www.haxor.no.sh

Test the script by running it manually the first time.

/home/backup_user/scripts/backup_DB_www.haxor.no.sh

Schedule

Backup every night

To automate the backup process, you need a couple of prerequisites in place. First, you will need a passwordless SSH key pair. This can be created by running this command:

ssh-keygen

Then, to transfer the public key to the webserver, run this command:

ssh-copy-id backup_user@www.haxor.no

Test the backup script one more time. Now you should not be prompted for a password by the webserver.
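
You can confirm that the transfer worked by listing the local backup directory and checking that a fresh, compressed dump has arrived:

ls -lh /home/backup_user/backup/www.haxor.no/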

If all goes well, you are now ready to add the script to your crontab.

crontab -e
backup_user's crontab
...
30 2 * * 0-6 /home/backup_user/scripts/backup_DB_www.haxor.no.sh

This will run the backup script at 02:30 every night.
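
To verify that the job was registered, list the crontab for the backup user:

crontab -l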