How to Schedule a Backup of All Websites Hosted on Ubuntu


In this guide, you will learn how to create a shell script that backs up all of the websites hosted on your Ubuntu 16.04 server. You will also learn how to set up a scheduled task that runs the backup script daily.

Introduction

You may want to make regular backups of your websites in case you pick up a virus or someone makes an irreversible change. In this guide, you will learn how to write a simple script that can be scheduled to keep multiple days' worth of website archives. The guide assumes you have a directory structure like the following:

[Image: websites directory structure]
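In other words, each website lives in its own folder directly under /var/www. The site names below are only examples:

/var/www
├── example.com
├── example.net
└── example.org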

If you haven’t stored all of your websites in /var/www, you can still use this script, as the location is configurable; however, every site must live in the same root folder.

Step 1: Create the Backup Script

The first thing we need to do is create a simple backup script that backs up the websites located in /var/www. The script will change into that directory and then loop through it, archiving each folder into /backups/www. It will also clean up backups older than a configurable number of days.

Create a folder to store your backup script in. I suggest /scripts for this example:

sudo mkdir /scripts

Create a file called www-backup.sh inside the scripts folder:

sudo vim /scripts/www-backup.sh

Add the following code to the file and save it:

#!/bin/bash
#----------------------------------------
# OPTIONS
#----------------------------------------
DAYS_TO_KEEP=7    # 0 to keep forever
WWW_PATH='/var/www'
BACKUP_PATH='/backups/www'
#----------------------------------------

# Create the backup folder if it doesn't exist
if [ ! -d "$BACKUP_PATH" ]; then
  mkdir -p "$BACKUP_PATH"
fi

# Change into the web root directory
if ! cd "$WWW_PATH"; then
  echo "Failed to change directory to root of web path"
  exit 1
fi

# Use a single timestamp so every archive from this run shares a date
date=$(date -I)

# Archive each website folder (directories only; symlinks are skipped)
for website in * ; do
  if [[ -d "$website" && ! -L "$website" ]]; then
    echo "Found website folder: $website"
    tar -cvpzf "$BACKUP_PATH/$date-$website.tar.gz" "$website"
  fi
done

# Delete backup archives older than DAYS_TO_KEEP days
if [ "$DAYS_TO_KEEP" -gt 0 ]; then
  echo "Deleting backups older than $DAYS_TO_KEEP days"
  find "$BACKUP_PATH" -maxdepth 1 -type f -name '*.tar.gz' -mtime +"$DAYS_TO_KEEP" -exec rm {} \;
fi

You will notice three configurable options at the beginning of this script. If your websites live in /var/www, you shouldn’t need to edit anything, although you may want to change the backup location and the number of days' worth of backups to keep.
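For example, if you wanted to keep two weeks of backups of sites stored under /srv/www (both values here are purely illustrative), the options block would become:

DAYS_TO_KEEP=14            # keep two weeks of backups
WWW_PATH='/srv/www'        # root folder containing the website folders
BACKUP_PATH='/backups/www' # where the archives are written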

Once you have modified the options, make the script executable with the following command:

sudo chmod +x /scripts/www-backup.sh

You can now test the backup script by running:

sudo /scripts/www-backup.sh

After running the script, you should see that the sites have all been backed up to /backups/www:

[Image: backups directory]
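You can confirm this with ls. With the default options and the example site names from earlier, the output would look something like:

ls /backups/www
2016-12-11-example.com.tar.gz
2016-12-11-example.net.tar.gz
2016-12-11-example.org.tar.gz

To restore a site, extract its archive back into the web root, for example:

sudo tar -xvpzf /backups/www/2016-12-11-example.com.tar.gz -C /var/www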

Step 2: Create the Scheduled Task

Now we will schedule the backup script to be run daily. We will do this by adding a call to the www-backup.sh script to the root crontab.

Run the following command to open the root crontab file:

sudo crontab -e

Now add the following line to the end of the file. Note that the script is run directly rather than via sh, because it uses bash-only syntax:

@daily /scripts/www-backup.sh >> /var/log/www-backup.log 2>&1

Save the file and then wait for the script to run. You can check for errors in /var/log/www-backup.log. Once you are happy it is working, you can remove the >> /var/log/www-backup.log 2>&1 from the crontab file.
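@daily runs the script at midnight. If you would rather pick the time yourself, you can use a standard cron expression instead; for example, to run the backup at 02:30 every day:

30 2 * * * /scripts/www-backup.sh >> /var/log/www-backup.log 2>&1

You can also check that your entry was saved by listing the root crontab with sudo crontab -l.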

Conclusion

That’s it. You should now have a daily backup of all of your websites. You can change the DAYS_TO_KEEP option if you want to keep more or fewer backups. If you have any questions, or you have a better way of backing up websites on Ubuntu, leave a comment below.