Automatically backup your website to Dropbox
Until recently, I had been taking periodic manual backups of the content on my website. But life has been extremely busy of late, and I often missed my backup schedule. It did not help that I had to log in to my cPanel to generate the backups and then download them manually - all time-consuming activities. I kept deferring my backups, reasoning to myself that I hadn't made any recent updates in any case.
One day, however, I found that some comments left on my blog were missing. I tried to investigate but couldn't figure out what had happened. Then I encountered some missing files - again without explanation. Then my website just rolled over and died! Each time I got unsatisfactory explanations from my web host: a planned server migration had been cancelled; they had performed some security updates at the server level; and so on. All the while they assured me that my content should be intact - well, I felt differently.
That was when I decided that an automated backup solution was a necessity: something that needed minimal manual intervention.
I started with a Google search and found various scriptlets that could help. The most promising was one I found at LifeHacker.
I used the script provided in that article as a base and built on it to make it more configurable.
The script I ended up building was:
Script: databackup.sh
#!/bin/sh
#
# Script:  databackup.sh
# Version: 1.2
#
# Purpose:
#    Perform an Automatic Backup based on inputs provided in a Config file.
#
# License: LGPLv2
# Author:  Kiran J. Holla
#          http://www.kiranjholla.com/
#
# Description:
#    This script provides a utility that can be used to automatically backup various
#    elements of a website. Individual TAR files are created as per configuration and
#    then optionally uploaded to Dropbox.
#
#    The upload to Dropbox functionality makes use of the script provided by Andrea Fabrizi
#    at http://www.andreafabrizi.it/?dropbox_uploader
#
#
# Modification Log:
#    v1.0    Nov 25, 2012    Initial version
#
#    v1.1    Jan 14, 2013    Fixed some bugs where the usage was getting printed multiple
#                            times and the number of days for the directory backup was
#                            yielding negative numbers when the year changes.
#
#    v1.2    Mar 16, 2013    Included the --create-options option in the mysqldump command
#                            to ensure that the auto increment parameter is not skipped in
#                            the dump file.
#

print_usage()
{
    # Function to print the Usage instructions for this script.
    #
    echo "Usage:"
    echo "    databackup.sh <Backup Name> <Param 1> [<Param 2> [<Param 3> [. . .]]]"
    echo " "
    echo "    Backup Name: The name by which the consolidated backup should be named. A file by name"
    echo "                 BACKUP_<Backup Name>_<Date & Time>.tar is created."
    echo " "
    echo "    Parameters:  Each parameter marks a set of Backup instructions that need to be documented"
    echo "                 in the .databackup_config file that should exist in the same directory as this"
    echo "                 script."
    echo " "
    echo "Sample Configuration File Contents:"
    echo "    PARAM1:BKPTYPE:DIR"
    echo "    PARAM1:BKPFORI:FULL"
    echo "    PARAM1:BKPATTR1:WebFiles"
    echo "    PARAM1:BKPATTR2:/home/user/httpd/www"
    echo "    PARAM1:BKPATTR3:/home/user/backups"
    echo " "
    echo "    PARAM2:BKPTYPE:DB"
    echo "    PARAM2:BKPFORI:FULL"
    echo "    PARAM2:BKPATTR1:WebDatabase"
    echo "    PARAM2:BKPATTR2:my_database"
    echo "    PARAM2:BKPATTR3:/home/user/backups"
    echo " "
    echo "Then, run the script as below:"
    echo "    databackup.sh WebBackup PARAM1 PARAM2"
    echo " "
}

backup_db()
{
    # Function to dump a MySQL database and then package the resulting
    # SQL file in a tar.
    #
    # This function assumes that the configuration parameter ${DB_BACKUP_USER}
    # is set to a user name that possesses sufficient privileges to run
    # mysqldump on the database being backed up.
    #
    # It further assumes that a .my.cnf file has been created in the home
    # directory of the user running this script, and that the file contains
    # the correct password for the user name being used.
    #
    DB_BACKUP_USER=`grep 'DB_BACKUP_USER' ${CNFGFILE} | cut -d":" -f2`
    DB_HOST=`grep 'DB_HOST' ${CNFGFILE} | cut -d":" -f2`

    echo "Dumping Database " >> ${MAILFILE}
    echo " " >> ${MAILFILE}

    mysqldump -u ${DB_BACKUP_USER} -h ${DB_HOST} --skip-opt --add-drop-table --create-options --complete-insert --extended-insert --single-transaction --result-file="${BKPATTR3}/BKPFULL_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.sql" ${BKPATTR2}

    cd ${BKPATTR3}
    tar --create --verbose --file "BKPFULL_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.tar" "BKPFULL_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.sql"

    echo " " >> ${MAILFILE}

    if [ -f "${BKPATTR3}/BKPFULL_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.tar" ]
    then
        echo "Database backed up to ${BKPATTR3}/BKPFULL_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.tar" >> ${MAILFILE}

        rm "${BKPATTR3}/BKPFULL_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.sql"
        return 0
    else
        echo "Error in Database backup." >> ${MAILFILE}
        return 1
    fi
}

backup_dir()
{
    # Function to backup a physical directory. There are two options:
    #
    #    1: Full Backup - this option creates a full backup of the
    #       entire directory, including any subdirectories within it.
    #
    #    2: Incremental Backup - this option takes a backup of only
    #       those files that were modified after the LASTRUN date.
    #
    echo "Backing up directory ${BKPATTR2}" >> ${MAILFILE}
    BKPOK=1

    LASTRUN=`grep ${BKPMODE} ${CNFGFILE} | grep 'LASTRUN' | cut -d":" -f3`    #Last Run Date in YYYYMMDD

    if [ "${BKPFORI}" = "FULL" ] || [ -z "${LASTRUN}" ]
    then
        # Perform Full Backup
        tar --create --verbose --recursion --file ${BKPATTR3}/BKP${BKPFORI}_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.tar ${BKPATTR2}

        BKPOK=$?
        echo "Tar Return Status ${BKPOK}" >> ${MAILFILE}
    else
        # Perform Incremental Backup
        echo "Backup was last run on ${LASTRUN}" >> ${MAILFILE}
        DAYSLAST=$((($(date -u -d "${DTTODAY}" +%s) - $(date -u -d "${LASTRUN}" +%s)) / 86400))

        echo "Days since last run ${DAYSLAST}" >> ${MAILFILE}

        #Create an empty TAR file, which can then be used to add the modified files
        #(--files-from /dev/null stops GNU tar from refusing to create an empty archive)
        tar --create --verbose --files-from /dev/null --file ${BKPATTR3}/BKP${BKPFORI}_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.tar
        find ${BKPATTR2} -type f -mtime -${DAYSLAST} -exec tar --append --verbose --dereference --file ${BKPATTR3}/BKP${BKPFORI}_${BKPTYPE}_${BKPATTR1}_${TMTODAY}.tar "{}" \;

        BKPOK=$?
        echo "Tar Return Status ${BKPOK}" >> ${MAILFILE}
    fi

    if [ ${BKPOK} -eq 0 ]
    then
        echo " " >> ${MAILFILE}
        echo "TAR created." >> ${MAILFILE}

        #Backup has been successful
        #Replace the LASTRUN date in Config file with today's date
        cat ${CNFGFILE} | grep -v "${BKPMODE}:LASTRUN" > ${CNFGTEMP}
        echo "${BKPMODE}:LASTRUN:${DTTODAY}" >> ${CNFGTEMP}
        cat ${CNFGTEMP} > ${CNFGFILE}

        rm -f ${CNFGTEMP}

        echo "Backed up the directory." >> ${MAILFILE}
        return 0
    else
        echo "Error while backing up directory!" >> ${MAILFILE}
        return 1
    fi
}

export CNFGFILE
export CNFGTEMP
export MAILFILE
export TMTODAY
export DTTODAY
export BKPFORI
export BKPATTR1
export BKPATTR2
export BKPATTR3
export BKPMODE

SCRPTDIR=$(cd "$(dirname "$0")" && pwd)    #Absolute path, so cd ${SCRPTDIR} still works after cd ${BKPATTR3}
CNFGFILE=${SCRPTDIR}/.databackup_config
CNFGTEMP=${SCRPTDIR}/.databackup_conftemp
MAILFILE=${SCRPTDIR}/mailfile_temp.txt
MAILADDR=`grep 'BKP_MAIL_RECIPIENT' ${CNFGFILE} | cut -d":" -f2`

trap 'cat ${MAILFILE} | mail -s "Backup Error!" ${MAILADDR}; rm -f ${MAILFILE}; rm -f ${CNFGTEMP}; exit' 1 2 3 15

TMTODAY=`date +%Y%m%d%H%M%S`
DTTODAY=`date +%Y%m%d`

# The name that is to be used to create the consolidated backup file
if [ "$1" ]
then
    BKPNAME=${1}
    shift
else
    print_usage >> ${MAILFILE}
    exit 1
fi

if [ "$1" ]
then
    echo "Running Backup at ${TMTODAY} for ${BKPNAME}" > ${MAILFILE}
    echo " " >> ${MAILFILE}

    while [ "$1" ]
    do
        echo "~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~" >> ${MAILFILE}
        echo "Running for $1 . . ." >> ${MAILFILE}

        BKPMODE=${1}
        BKPTYPE=`grep ${BKPMODE} ${CNFGFILE} | grep 'BKPTYPE' | cut -d":" -f3`     #DB or DIR
        BKPFORI=`grep ${BKPMODE} ${CNFGFILE} | grep 'BKPFORI' | cut -d":" -f3`     #Full or Incremental
        BKPATTR1=`grep ${BKPMODE} ${CNFGFILE} | grep 'BKPATTR1' | cut -d":" -f3`   #Site Name
        BKPATTR2=`grep ${BKPMODE} ${CNFGFILE} | grep 'BKPATTR2' | cut -d":" -f3`   #Source DB or Directory
        BKPATTR3=`grep ${BKPMODE} ${CNFGFILE} | grep 'BKPATTR3' | cut -d":" -f3`   #Backup Directory

        echo "BKPTYPE  = ${BKPTYPE}" >> ${MAILFILE}
        echo "BKPFORI  = ${BKPFORI}" >> ${MAILFILE}
        echo "BKPATTR1 = ${BKPATTR1}" >> ${MAILFILE}
        echo "BKPATTR2 = ${BKPATTR2}" >> ${MAILFILE}
        echo "BKPATTR3 = ${BKPATTR3}" >> ${MAILFILE}

        echo " " >> ${MAILFILE}

        if [ "$BKPTYPE" ]
        then
            case $BKPTYPE in
                "DB")  backup_db ;;
                "DIR") backup_dir ;;
            esac
        fi

        shift
    done

    echo "Creating Consolidated Backup TAR . . ." >> ${MAILFILE}

    cd ${BKPATTR3}
    tar --create --verbose --file BACKUP_${BKPNAME}_${TMTODAY}.tar BKP*${TMTODAY}.tar

    if [ -f ${BKPATTR3}/BACKUP_${BKPNAME}_${TMTODAY}.tar ]
    then
        echo "Consolidated Backup TAR created successfully!" >> ${MAILFILE}
        echo " " >> ${MAILFILE}
        echo "Removing intermediate backups . . ." >> ${MAILFILE}

        rm ${BKPATTR3}/BKP*${TMTODAY}.tar >> ${MAILFILE}

        cd ${SCRPTDIR}

        if [ -x ./dropbox_uploader.sh ]
        then
            echo "Backing up ${BKPATTR3}/BACKUP_${BKPNAME}_${TMTODAY}.tar to Dropbox. . ." >> ${MAILFILE}
            ./dropbox_uploader.sh upload ${BKPATTR3}/BACKUP_${BKPNAME}_${TMTODAY}.tar BACKUP_${BKPNAME}_${TMTODAY}.tar >> ${MAILFILE}
            if [ $(grep -c 'DONE' ${MAILFILE}) -ne 0 ]
            then
                rm ${BKPATTR3}/BACKUP_${BKPNAME}_${TMTODAY}.tar
                echo "Removed Consolidated TAR after upload to Dropbox" >> ${MAILFILE}
            fi
        else
            echo "Dropbox uploader not found. Skipping Dropbox backup!" >> ${MAILFILE}
        fi
    fi
else
    print_usage >> ${MAILFILE}
    exit 1
fi

cat ${MAILFILE} | mail -s "Backup Completed!" ${MAILADDR}

rm -f ${MAILFILE}
rm -f ${CNFGTEMP}
This script is capable of performing the following types of backups of a Linux-MySQL based website:
- Full Directory Backup
- Incremental Directory Backup
- Full Database Backup
You can use the script to perform multiple such backups and package all of them together into a consolidated TAR file. This consolidated backup can be uploaded to Dropbox using the dropbox_uploader.sh provided by Andrea Fabrizi.
To set up the automated backup using this script, simply follow the steps below:
1. Download the script and sample configuration file
This script and the associated configuration file can be downloaded from here. Once you have them, upload them to a directory on your server. Don't forget to make databackup.sh executable.
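As a quick sketch (the ~/scripts directory is just a hypothetical location; use whatever directory you uploaded the files to):

```shell
# Hypothetical install location; adjust to wherever you uploaded the files.
mkdir -p ~/scripts
cd ~/scripts
: > databackup.sh            # stand-in file for this demo; in reality this is the downloaded script
chmod +x databackup.sh       # the script must be executable before cron can run it
test -x databackup.sh && echo "databackup.sh is executable"
```

Remember that the .databackup_config file must sit in the same directory as the script.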
2. Setup the databackup configuration file
You can set up the script to run multiple types of backups. Each backup is identified by a Backup Parameter. This Backup Parameter can be any string you choose; it is only used to identify the group of configuration parameters associated with the particular backup you want the script to execute. The Backup Parameters are passed to the script as command-line arguments.
For each Backup Parameter the script looks for the following Configuration Parameters:
- BKPTYPE - The backup type; it could be a directory or a database (DIR or DB)
- BKPFORI - Full backup or Incremental (FULL or INCR). This parameter is ignored for database backups; it is considered only when BKPTYPE is DIR
- BKPATTR1 - The individual backup name; this is included in the individual TAR file names to help you distinguish what each TAR contains
- BKPATTR2 - The backup source. For a DIR backup, this is the directory that needs to be backed up; for a DB backup, it is the database that needs to be backed up.
- BKPATTR3 - The backup destination, i.e. the directory where the backup TAR files should be stored.
In addition, you will need to also setup a couple of generic parameters within the configuration file:
- DB_BACKUP_USER - A MySQL database user having the privileges to run the mysqldump command.
- DB_HOST - The hostname of the database server. In most cases, this would be localhost
- BKP_MAIL_RECIPIENT - The email address to which the resulting log file is sent after the backup script completes.
The various parameters are delimited by the character ":".
A sample configuration file could be:
TEST_DIR_FULL:BKPTYPE:DIR
TEST_DIR_FULL:BKPFORI:FULL
TEST_DIR_FULL:BKPATTR1:Test
TEST_DIR_FULL:BKPATTR2:/user/test/httpd/web
TEST_DIR_FULL:BKPATTR3:/user/backup
TEST_DB_FULL:BKPTYPE:DB
TEST_DB_FULL:BKPFORI:FULL
TEST_DB_FULL:BKPATTR1:Test
TEST_DB_FULL:BKPATTR2:the_db_name
TEST_DB_FULL:BKPATTR3:/user/backup
DB_BACKUP_USER:backup_db_username
DB_HOST:servername
BKP_MAIL_RECIPIENT:emailid@example.com
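To illustrate how the script reads these colon-delimited entries, here is the same grep/cut pipeline databackup.sh uses, run against a throwaway demo file (the /tmp path is only for this demonstration):

```shell
# Throwaway config file, just for this demonstration.
CNFGFILE=/tmp/demo_databackup_config
printf '%s\n' \
    'TEST_DB_FULL:BKPTYPE:DB' \
    'TEST_DB_FULL:BKPATTR2:the_db_name' > "${CNFGFILE}"

# The same pattern databackup.sh uses: filter lines by Backup Parameter,
# then by Configuration Parameter, then take the third colon-delimited field.
BKPTYPE=`grep 'TEST_DB_FULL' ${CNFGFILE} | grep 'BKPTYPE' | cut -d":" -f3`
echo "BKPTYPE = ${BKPTYPE}"    # prints: BKPTYPE = DB

rm -f "${CNFGFILE}"
```

This is also why neither the Backup Parameter names nor the values may contain a ":" character.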
Once the configuration file has been set up, run the script in the following format:
databackup.sh <BackupName> <BackupParam1> [<BackupParam2> [. . .]]
Here:
- BackupName is a generic name that is given to the entire consolidated backup that is created.
- BackupParam1, BackupParam2, etc. are the individual Backup Parameters. Each of them needs the five configuration parameters defined within the .databackup_config configuration file.
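The snippet below mirrors how the script consumes its arguments: the first argument becomes the consolidated backup name and every remaining argument is treated as a Backup Parameter (the argument values here are taken from the sample configuration above):

```shell
# Simulate the command line: databackup.sh MyWebBackup TEST_DIR_FULL TEST_DB_FULL
set -- MyWebBackup TEST_DIR_FULL TEST_DB_FULL

BKPNAME=$1    # the first argument names the consolidated backup
shift         # every remaining argument is a Backup Parameter

echo "Backup name: ${BKPNAME}"
for PARAM in "$@"
do
    echo "Would back up: ${PARAM}"
done
```

This prints the backup name followed by one line per Backup Parameter, in the order they were passed.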
3. Setup access parameters for the database access
One of my key considerations while writing this script was that I did not want the database password included in the call to the mysqldump command, because command-line parameters passed to mysqldump could end up in server log files.
To better control this, I decided to use the approach suggested at Techie Corner to set up the mysqldump command without passing in the password using the -p option.
For setting up the access parameters for the database, create a .my.cnf file in your home directory as follows (note the [mysqldump] group header: MySQL option files require every option to appear under a group, and mysqldump reads the [client] and [mysqldump] groups):
[mysqldump]
user = backup_db_username
password = some_complicated_password
Please take care to ensure that the user mentioned in this file is the same as the one specified against the parameter DB_BACKUP_USER in step 2 above.
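Since this file holds a password in plain text, it is worth restricting it so that only your user can read it. A minimal sketch (touch is only there to make the snippet self-contained if the file does not exist yet):

```shell
# The file holds a plain-text password, so restrict it to your user only.
touch ~/.my.cnf              # created in step 3 above
chmod 600 ~/.my.cnf          # owner read/write, no access for anyone else
ls -l ~/.my.cnf              # should show -rw------- permissions
```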
4. Setup the Dropbox Uploader script and configuration
If you want the backup uploaded to Dropbox, you will also need to download the script from Andrea's website. Andrea provides a script named dropbox_uploader.sh that can upload files from Linux to Dropbox, along with detailed setup instructions.
Using a shell, manually run the dropbox_uploader.sh script once. It will walk you through a series of configuration steps that set the script up as a Dropbox application on your Dropbox account, which you then authorize to access your account.
If you don't have shell access, this is trickier. To get around the limitation, I edited the dropbox_uploader.sh script to pipe its output at the appropriate stages to the mail program, mailing the output to myself. I then ran the script via cron, waited for the mail to arrive, and took the recommended steps. This approach is not for everyone, though; it needs a fair degree of shell-scripting expertise and technical know-how to understand what's happening.
5. Setting up a cron to run this script
The script by itself solves only half the problem. To be a truly automatic solution, it needs to run without our intervention. To do this, we simply schedule the script as part of the crontab.
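As a sketch, a crontab entry like the one below (hypothetical path, names, and schedule; add it with crontab -e) would run the backup every Sunday at 2 AM:

```
# m  h  dom mon dow  command
0    2  *   *   0    /home/user/scripts/databackup.sh MyWebBackup TEST_DIR_FULL TEST_DB_FULL
```

The arguments after the script path are the Backup Name and the Backup Parameters from step 2.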
That's about it, folks. If everything has been set up correctly, the directories and databases configured in Step 2 should get backed up into a TAR file such as BACKUP_MyWebBackup_20121125170101.tar and uploaded to your Dropbox account.
Feel free to use the scripts provided. If you have any questions or comments about them, do leave a comment and I will do my best to respond.
Disclaimer:
At this stage, I would like to remind you that these scripts come with no warranties. Please use your own judgement and discretion before downloading and making use of them. I will not accept liability should anything unexpected happen.