Step-by-step guide to creating automated, incremental backups on Ubuntu using rsync and cron. Covers local backups, remote server backups, rotation policies, and email notifications.
rsync is the gold standard for Linux backups because it:

- transfers only the changed portions of files (delta transfer), so repeat runs are fast
- preserves permissions, ownership, timestamps, and symlinks
- works the same way locally and over SSH
- is free and available on every major Linux distribution
Combined with cron for scheduling, you get a reliable, zero-cost backup system.
The fundamental rsync backup command:
rsync -avz --delete /home/user/documents/ /mnt/backup/documents/
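Because --delete removes files from the destination, it is worth previewing a new command with rsync's -n (--dry-run) flag first, which lists what would be transferred or deleted without actually changing anything:

rsync -avzn --delete /home/user/documents/ /mnt/backup/documents/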
Flags explained:

- -a (archive): preserves permissions, timestamps, symlinks, etc.
- -v (verbose): shows files being transferred
- -z (compress): compresses data during transfer
- --delete: removes files from the backup that were deleted from the source

The trailing slash on the source (documents/) means "copy the contents of this directory." Without it, rsync would create a documents subdirectory inside the destination.

Create a reusable backup script at /usr/local/bin/backup.sh:
#!/bin/bash
set -euo pipefail
# Configuration
SOURCE_DIRS=(
    "/home/user/documents"
    "/home/user/projects"
    "/etc"
)
BACKUP_ROOT="/mnt/backup"
LOG_FILE="/var/log/backup.log"
DATE=$(date +%Y-%m-%d_%H%M)
# Start logging
echo "=== Backup started: $DATE ===" >> "$LOG_FILE"
for dir in "${SOURCE_DIRS[@]}"; do
    dirname=$(basename "$dir")
    dest="$BACKUP_ROOT/$dirname"
    mkdir -p "$dest"

    rsync -az --delete \
        --exclude=".cache" \
        --exclude="node_modules" \
        --exclude=".git" \
        "$dir/" "$dest/" \
        >> "$LOG_FILE" 2>&1

    echo "  Backed up: $dir -> $dest" >> "$LOG_FILE"
done
echo "=== Backup finished: $(date +%Y-%m-%d_%H%M) ===" >> "$LOG_FILE"
Make it executable:
sudo chmod +x /usr/local/bin/backup.sh
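Before scheduling it, run the script once by hand and check the log (sudo because it writes to /var/log/backup.log):

sudo /usr/local/bin/backup.sh
tail -n 20 /var/log/backup.log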
To back up to a remote server, use rsync over SSH:
rsync -avz --delete \
    -e "ssh -p 22 -i ~/.ssh/backup_key" \
    /home/user/documents/ \
    backupuser@192.168.1.100:/backups/documents/
Setting up SSH key authentication (so cron can run without a password prompt):
# Generate a dedicated backup key
ssh-keygen -t ed25519 -f ~/.ssh/backup_key -N ""
# Copy it to the remote server
ssh-copy-id -i ~/.ssh/backup_key backupuser@192.168.1.100
# Test the connection
ssh -i ~/.ssh/backup_key backupuser@192.168.1.100 echo "Connected!"
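To point the backup script at the remote server instead of a local drive, one approach is to swap the destination in its rsync call for an SSH target. A minimal sketch, assuming the same backupuser, key, and /backups path from above:

# In backup.sh, replace the local rsync call inside the loop with:
REMOTE="backupuser@192.168.1.100"
rsync -az --delete \
    -e "ssh -i ~/.ssh/backup_key" \
    "$dir/" "$REMOTE:/backups/$dirname/" \
    >> "$LOG_FILE" 2>&1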
Open the crontab editor:
crontab -e
Add a schedule. Common patterns:
# Daily at 2:00 AM
0 2 * * * /usr/local/bin/backup.sh
# Every 6 hours
0 */6 * * * /usr/local/bin/backup.sh
# Weekdays at 11 PM
0 23 * * 1-5 /usr/local/bin/backup.sh
# Sunday at 3 AM (weekly full backup)
0 3 * * 0 /usr/local/bin/backup.sh
Verify your crontab was saved:
crontab -l
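Cron can also deliver the email notifications mentioned at the start: if a mail transfer agent (postfix, for example) is installed, cron mails a job's output to the address in MAILTO (the address below is a placeholder). Since backup.sh already redirects its normal output to the log file, mail will only arrive for errors that escape that redirect:

MAILTO="admin@example.com"
0 2 * * * /usr/local/bin/backup.sh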
To prevent your backup drive from filling up, add rotation to keep only the last 7 daily backups:
#!/bin/bash
set -euo pipefail

BACKUP_ROOT="/mnt/backup"
DAILY_DIR="$BACKUP_ROOT/daily"
MAX_BACKUPS=7
TODAY=$(date +%Y-%m-%d)
DEST="$DAILY_DIR/$TODAY"

mkdir -p "$DAILY_DIR"

# Hard-link today's backup to the most recent snapshot (saves space);
# "|| true" keeps set -e happy on the first run, when no snapshot exists yet
LATEST=$(ls -1d "$DAILY_DIR"/2* 2>/dev/null | tail -1 || true)
if [ -n "$LATEST" ] && [ "$LATEST" != "$DEST" ] && [ ! -d "$DEST" ]; then
    cp -al "$LATEST" "$DEST"
fi
mkdir -p "$DEST"

# Sync changes on top of the hard-linked copy
rsync -az --delete /home/user/documents/ "$DEST/documents/"

# Remove old backups beyond MAX_BACKUPS (xargs -r skips rm when nothing is old enough)
ls -1d "$DAILY_DIR"/2* | head -n -"$MAX_BACKUPS" | xargs -r rm -rf

echo "Backup complete: $DEST (keeping last $MAX_BACKUPS)"
This approach uses hard links (cp -al) so unchanged files do not consume extra disk space.
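You can verify the savings yourself: GNU du (the Ubuntu default) counts a hard-linked file only once per invocation, so in a single command the oldest snapshot shows its full size while later snapshots show only the data that changed:

du -sh /mnt/backup/daily/*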
How do I restore from a backup?
Just rsync in the other direction: rsync -avz /mnt/backup/documents/ /home/user/documents/. Or copy individual files with cp.
How much data does each backup transfer?
The first backup copies everything. Subsequent backups only transfer changed files, typically amounting to 1-5% of the full backup size.
Can I back up to cloud storage like S3?
rsync does not support S3 directly. Use rclone instead, which has a similar syntax but supports S3, Google Drive, Dropbox, and 40+ cloud providers.
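A minimal rclone equivalent of the local backup, assuming a remote named "s3" has already been set up with rclone config and my-backup-bucket is a placeholder bucket name:

rclone sync /home/user/documents s3:my-backup-bucket/documents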
What happens if a backup fails partway through?
rsync is safe: a failed backup leaves the previous backup intact. Add error handling and email notifications to your script to be alerted of failures.
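A simple way to wire that up is a wrapper that runs the backup and mails on failure; a sketch assuming the mailutils package provides the mail command, with a placeholder address:

#!/bin/bash
# Wrapper: run the backup and send an alert if it exits non-zero
if ! /usr/local/bin/backup.sh; then
    echo "Backup failed on $(hostname) at $(date)" \
        | mail -s "Backup FAILED on $(hostname)" admin@example.com
fi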