Today I will show how to set up periodic incremental backups using the rsync, crontab and sshfs utilities on Linux. The setup consists of two PCs: a workstation running Ubuntu and a media server running Ubuntu Server. I will show how to set up scripts that run the daily backup tasks automatically. By the end of the tutorial we should have achieved the following goals:
- The entire /home partition of the workstation is copied to the server every two hours
- All important directories on the server are copied to a dedicated backup hard drive
- Incremental backups are set up to store daily snapshots over a chosen number of days
- The amount of data to be copied over the network and stored is reduced by using rsync and hard links
- The backup method described here is not meant to be bullet-proof, but it is enough to do the job
There is a hard drive installed in the server for storing backups. Storing backups on a dedicated drive in a network server reduces the possibility of losing data in case of disk failure, fire, theft etc. For this tutorial, let's say that the backup drive is mounted as /backup on the server. I make daily backups on the server, compared to 2-hour intervals on the workstation, since the data stored there does not change as often and is of lower importance. An incremental backup stores identical files as hard links, so it does not take much more storage space and allows access to daily data snapshots for up to two weeks back from today.
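The space saving from hard links can be seen with a quick local experiment (the /tmp paths below are demo stand-ins, not part of the actual backup setup):

```shell
#!/bin/bash
# Demo: a hard-linked snapshot shares inodes with the original, so an
# unchanged file costs no extra disk space. /tmp/hl-demo is a stand-in
# for the /backup tree on the server.
mkdir -p /tmp/hl-demo/latest
echo "unchanged data" > /tmp/hl-demo/latest/file.txt
cp -al /tmp/hl-demo/latest /tmp/hl-demo/today-1  # -a archive mode, -l hard links
stat -c %h /tmp/hl-demo/today-1/file.txt         # prints 2: two names, one inode
```

Both directory trees name the same data blocks, so fourteen daily snapshots of a mostly unchanged home partition cost little more than one full copy.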
You can find a guide to sshfs at www.linuxjournal.com. Make sure that you can mount and access the directories on the backup server. Here is a mount script for the workstation that runs during user login:
#!/bin/bash
sshfs -o ro -o idmap=user user@server:/backup/weekly /server/backup-weekly/
sshfs -o ro -o idmap=user user@server:/backup/latest /server/backup/
Notice that the backup directories are mounted read-only on the workstation in order to prevent accidental or malicious data deletion.
Create a script that copies the entire home partition to /backup/latest/home. Save it on the workstation in a location of your choice (/home/localuser/backup is used in this tutorial). Add execute permission (chmod 700 backup) and change the owner to root (chown root:root backup).
#!/bin/bash
sudo rsync --ignore-errors -ave ssh --delete /home user@server:/backup/latest/ > backup.log
echo "Finished" >> backup.log
The script is scheduled to run every 2 hours as root using crontab (run sudo crontab -e to add the task). Root access is needed because all users' home directories are copied. Notice that rsync uses SSH directly rather than the read-only backup mount, so there is no need to remount it in read-write mode. The Gnome-schedule GUI utility can be used to set up the periodic task instead of the command-line crontab.
# do backup every 2 hours at 0th minute
0 */02 * * * /home/localuser/backup
Another script handles the server side. It also runs as a scheduled job.
Basically, the script rotates 14 daily backup snapshots in /backup/weekly by deleting the oldest, renaming the others and creating the newest snapshot with hard links (notice the -al parameters to cp). Since the script runs only once a day, all data is first copied to /backup/latest; during the next run (tomorrow) the contents of /backup/latest become /backup/weekly/today-1 (yesterday), and so on. The last part of the script syncs the server directories to /backup/latest; the script on the workstation does the same, only at a two-hour interval. It is stored as /backup/backup. The script is only run by root, so the file permissions should be set accordingly.
#!/bin/bash
rm -rf /backup/weekly/today-14
mv /backup/weekly/today-13 /backup/weekly/today-14
# ..skipped...
mv /backup/weekly/today-1 /backup/weekly/today-2
cp -al /backup/latest /backup/weekly/today-1
rsync -v -a --delete /ftp /backup/latest
# add any other important directory here
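The chain of mv commands can also be written as a loop, which turns the retention window into a single variable. This is a sketch: DAYS and BASE are stand-ins for the 14-day window and /backup/weekly, and the mkdir only fakes a small snapshot tree for demonstration.

```shell
#!/bin/bash
# Loop-based rotation equivalent to the mv chain above.
DAYS=14
BASE=/tmp/rotate-demo                      # use /backup/weekly on the real server
mkdir -p "$BASE/today-1" "$BASE/today-2"   # fake snapshots for the demo
rm -rf "$BASE/today-$DAYS"                 # drop the oldest snapshot
for i in $(seq $((DAYS - 1)) -1 1); do     # shift today-13 ... today-1 up by one
    [ -d "$BASE/today-$i" ] && mv "$BASE/today-$i" "$BASE/today-$((i + 1))"
done
```

Changing the retention period then means editing one number instead of adding or removing mv lines.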
# m h dom mon dow command
0 23 * * * /backup/backup
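Finally, it helps to see why yesterday's snapshot survives today's sync: when a file changes, rsync writes the new version under a temporary name and renames it into place, which detaches the snapshot's hard link instead of writing through it. The mv below mimics that rename; the /tmp paths are demo stand-ins for the /backup tree.

```shell
#!/bin/bash
# Demo: replacing a file by rename (as rsync does) leaves the hard-linked
# snapshot copy untouched.
mkdir -p /tmp/snap-demo/latest
echo "version 1" > /tmp/snap-demo/latest/doc.txt
cp -al /tmp/snap-demo/latest /tmp/snap-demo/today-1          # hard-linked snapshot
echo "version 2" > /tmp/snap-demo/doc.txt.tmp
mv /tmp/snap-demo/doc.txt.tmp /tmp/snap-demo/latest/doc.txt  # rename over the old inode
cat /tmp/snap-demo/today-1/doc.txt   # still prints: version 1
```

If a tool instead wrote through the existing file descriptor, the change would appear in every snapshot at once; the rename semantics are what make the hard-link scheme safe.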