How to Backup and Restore a Website on VPS Linux

By Administrator
Published Oct 03, 2025
11 min read
That moment when you realize your website is gone. Maybe it was a server crash, a malicious attack, or an accidental rm -rf command in the wrong directory. Whatever the cause, the sinking feeling in your stomach is universal.

I learned this lesson the hard way when I lost six months of work on my first major project. I had backups... in theory. The backup script I kept meaning to write "someday" never materialized, and the manual backups I promised myself I'd do every week somehow never happened.

Don't let this happen to you. In this comprehensive guide, I'll show you how to set up reliable backups for your VPS-hosted website and, more importantly, how to restore them when disaster strikes. We'll cover everything from simple file backups to complete system restores.

Let's build your digital safety net! 🛡️

Why Website Backups Are Non-Negotiable

Before diving into the technical details, let's understand why backups are crucial:

  • Human error: We all make mistakes—accidentally deleting files, misconfiguring servers, or updating plugins that break everything
  • Hardware failures: SSDs fail, memory corrupts, and power supplies die
  • Security incidents: Malware infections, hacking attempts, and ransomware attacks
  • Software updates: Sometimes updates break compatibility or introduce bugs
  • Migration needs: Moving to a new server or hosting provider

Types of Website Backups

Not all backups are created equal. Understanding the different types helps you choose the right strategy:

Full Backups - Complete copy of all files and databases - Easiest to restore but most storage-intensive

Incremental Backups - Only backs up changes since the last backup - Storage-efficient but more complex to restore (see the tar example below)

File Backups - Website files, themes, plugins, and uploads - Essential for static sites and media-heavy sites

Database Backups - All your content, user data, and settings - Critical for dynamic sites like WordPress or Laravel applications
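
For a concrete feel of the difference between full and incremental backups, GNU tar can create incremental archives driven by a snapshot file that records what has already been saved. A minimal sketch (the snapshot-file path and archive names are just examples):

# Level-0 (full) backup: archives everything and records state in the snapshot file
tar --listed-incremental=/home/user/backups/site.snar -czf /home/user/backups/full-0.tar.gz -C /var/www yourdomain.com

# Later runs with the same snapshot file only archive files changed since the last run
tar --listed-incremental=/home/user/backups/site.snar -czf /home/user/backups/incr-1.tar.gz -C /var/www yourdomain.com

To restore, extract the level-0 archive first and then each incremental archive in order.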

Step-by-Step Backup and Restore Guide

Step 1: Assess Your Website Structure

Before creating backups, understand what needs to be backed up on your website.

Typical Website Components:

# Common web server paths
/var/www/yourdomain.com/          # Website files
/etc/nginx/sites-available/       # Nginx configuration
/etc/apache2/sites-available/     # Apache configuration
/etc/letsencrypt/                  # SSL certificates
/var/log/nginx/                    # Nginx logs
/home/user/.config/               # Application configs

# Database information
mysql -u root -p -e "SHOW DATABASES;"

Create a Backup Manifest:

mkdir -p ~/backup-info
nano ~/backup-info/website-manifest.txt

Document your website structure:

Website: yourdomain.com
Web root: /var/www/yourdomain.com
Database: yourdomain_db
Database user: yourdomain_user
Web server: Nginx
SSL cert location: /etc/letsencrypt/live/yourdomain.com/
Config files: /etc/nginx/sites-available/yourdomain.com
Important directories: uploads, themes, plugins
Backup frequency: Daily database, weekly full

Step 2: Backing Up Website Files

Let's start with the most straightforward part—backing up your website files.

Manual File Backup:

# Create backup directory
mkdir -p ~/backups/files

# Backup website files
tar -czf ~/backups/files/website-$(date +%Y%m%d_%H%M%S).tar.gz -C /var/www/ yourdomain.com/

# Backup configuration files and SSL certificates
# Note: /etc/letsencrypt/live/ contains only symlinks into ../archive, so back up the whole
# /etc/letsencrypt tree (root access is required to read it)
sudo tar -czf ~/backups/files/config-$(date +%Y%m%d_%H%M%S).tar.gz /etc/nginx/sites-available/yourdomain.com /etc/letsencrypt/

# List backups
ls -la ~/backups/files/

Automated File Backup Script:

mkdir -p ~/backup-scripts
nano ~/backup-scripts/backup-files.sh
#!/bin/bash
# backup-files.sh

BACKUP_DIR="/home/user/backups/files"
WEBSITE_DIR="/var/www/yourdomain.com"
DOMAIN="yourdomain.com"
DATE=$(date +%Y%m%d_%H%M%S)

# Create backup directory if it doesn't exist
mkdir -p $BACKUP_DIR

# Backup website files
echo "Backing up website files..."
tar -czf $BACKUP_DIR/website-$DATE.tar.gz -C /var/www/ $DOMAIN

# Backup configuration files (all of /etc/letsencrypt, since live/ only holds symlinks;
# run this script as root or from root's crontab so the certificates are readable)
echo "Backing up configuration files..."
tar -czf $BACKUP_DIR/config-$DATE.tar.gz /etc/nginx/sites-available/$DOMAIN /etc/letsencrypt/

# Keep only last 30 days of backups
echo "Cleaning up old backups..."
find $BACKUP_DIR -name "*.tar.gz" -mtime +30 -delete

echo "File backup completed: $DATE"
ls -la $BACKUP_DIR/

Make it executable:

chmod +x ~/backup-scripts/backup-files.sh

Test the Script:

~/backup-scripts/backup-files.sh

Step 3: Backing Up Databases

For dynamic websites, database backups are crucial.

Manual Database Backup:

# Create backup directory
mkdir -p ~/backups/database

# Backup MySQL/MariaDB database (capture the timestamp once so the compress step uses the same filename)
DATE=$(date +%Y%m%d_%H%M%S)
mysqldump -u yourdbuser -p'yourpassword' yourdatabase > ~/backups/database/database-$DATE.sql

# Compress the backup
gzip ~/backups/database/database-$DATE.sql

# List database backups
ls -la ~/backups/database/

Automated Database Backup Script:

nano ~/backup-scripts/backup-database.sh
#!/bin/bash
# backup-database.sh

BACKUP_DIR="/home/user/backups/database"
DB_USER="yourdbuser"
DB_PASS="yourpassword"
DB_NAME="yourdatabase"
DATE=$(date +%Y%m%d_%H%M%S)

# Create backup directory if it doesn't exist
mkdir -p $BACKUP_DIR

# Backup database
echo "Backing up database..."
mysqldump -u $DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/database-$DATE.sql

# Compress backup
gzip $BACKUP_DIR/database-$DATE.sql

# Keep only last 30 days of backups
echo "Cleaning up old backups..."
find $BACKUP_DIR -name "*.sql.gz" -mtime +30 -delete

echo "Database backup completed: $DATE"
ls -la $BACKUP_DIR/

Make it executable:

chmod +x ~/backup-scripts/backup-database.sh
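
Hard-coding the password in the script and passing it with -p exposes it to anyone who can read the file or the process list. One common alternative, assuming MySQL/MariaDB reads the standard per-user option file, is to put the credentials in ~/.my.cnf and drop them from the command line:

# Store credentials in an option file that mysql and mysqldump read automatically
cat > ~/.my.cnf << EOF
[client]
user=yourdbuser
password=yourpassword
EOF
chmod 600 ~/.my.cnf

With that in place, the script can simply run mysqldump yourdatabase > backup.sql without any -u or -p flags.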

Multiple Database Backup:

If you have multiple databases:

#!/bin/bash
# backup-all-databases.sh

BACKUP_DIR="/home/user/backups/database"
DB_USER="root"
DB_PASS="yourpassword"
DATE=$(date +%Y%m%d_%H%M%S)

mkdir -p $BACKUP_DIR

# Get list of databases (excluding system databases)
DATABASES=$(mysql -u $DB_USER -p$DB_PASS -e "SHOW DATABASES;" | grep -Ev "Database|information_schema|performance_schema|mysql|sys")

# Backup each database
for DB in $DATABASES; do
    echo "Backing up database: $DB"
    mysqldump -u $DB_USER -p$DB_PASS $DB > $BACKUP_DIR/$DB-$DATE.sql
    gzip $BACKUP_DIR/$DB-$DATE.sql
done

echo "All databases backup completed: $DATE"

Step 4: Setting Up Automated Backups

Now let's automate your backups using cron jobs.

Set Up Cron Jobs:

crontab -e

Add these lines for automated backups:

# Backup files daily at 2 AM
0 2 * * * /home/user/backup-scripts/backup-files.sh

# Backup database daily at 2:30 AM
30 2 * * * /home/user/backup-scripts/backup-database.sh

# Clean up old files weekly at 3 AM
0 3 * * 0 /home/user/backup-scripts/cleanup-old-backups.sh
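
Cron runs these jobs silently, so it is worth capturing their output. A simple approach (the log path below is just an example) is to append stdout and stderr to a log file:

# Same jobs, with output appended to a log file for troubleshooting
0 2 * * * /home/user/backup-scripts/backup-files.sh >> /home/user/backups/backup.log 2>&1
30 2 * * * /home/user/backup-scripts/backup-database.sh >> /home/user/backups/backup.log 2>&1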

Create Cleanup Script:

nano ~/backup-scripts/cleanup-old-backups.sh
#!/bin/bash
# cleanup-old-backups.sh

BACKUP_DIR="/home/user/backups"

# Remove backups older than 30 days
echo "Cleaning up backups older than 30 days..."
find $BACKUP_DIR -name "*.tar.gz" -mtime +30 -delete
find $BACKUP_DIR -name "*.sql.gz" -mtime +30 -delete

echo "Cleanup completed"

Make it executable:

chmod +x ~/backup-scripts/cleanup-old-backups.sh

Test Cron Jobs:

# Check if cron is running
sudo systemctl status cron

# Confirm your crontab entries were saved
crontab -l

# View cron activity (Debian/Ubuntu log cron runs to syslog by default)
grep CRON /var/log/syslog

Step 5: Offsite Backup Strategy

Backups on the same server aren't true backups. Let's set up offsite backups.

Remote Server Backup with rsync:

#!/bin/bash
# backup-remote.sh

LOCAL_BACKUP_DIR="/home/user/backups"
REMOTE_SERVER="[email protected]"
REMOTE_DIR="/remote/backups/yourdomain.com"

# Sync backups to remote server
echo "Syncing backups to remote server..."
rsync -avz --delete $LOCAL_BACKUP_DIR/ $REMOTE_SERVER:$REMOTE_DIR/

echo "Remote backup completed"

Cloud Storage Backup (using rclone):

Install rclone:

curl -s https://rclone.org/install.sh | sudo bash

Configure cloud storage:

rclone config

Create cloud backup script:

#!/bin/bash
# backup-cloud.sh

LOCAL_BACKUP_DIR="/home/user/backups"
CLOUD_REMOTE="yourcloud"  # Configured in rclone config

# Sync to cloud storage
echo "Syncing backups to cloud storage..."
rclone sync $LOCAL_BACKUP_DIR $CLOUD_REMOTE:backups/yourdomain.com/

echo "Cloud backup completed"

Step 6: Complete System Backup

For comprehensive disaster recovery, create full system backups.

System Backup with rsync:

#!/bin/bash
# backup-system.sh

BACKUP_DIR="/home/user/backups/system"
EXCLUDE_DIRS="--exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp --exclude=/run --exclude=/mnt --exclude=/media --exclude=/home/user/backups"
DATE=$(date +%Y%m%d_%H%M%S)

mkdir -p $BACKUP_DIR

# Create full system backup (run as root so every file is readable)
echo "Creating full system backup..."
sudo rsync -aAXv $EXCLUDE_DIRS / $BACKUP_DIR/system-$DATE/

echo "System backup completed: $DATE"

Filesystem Snapshot:

If using LVM or Btrfs filesystems, you can create snapshots:

# LVM snapshot
sudo lvcreate --size 1G --snapshot --name backup-snapshot /dev/vg00/root

# Mount snapshot
sudo mkdir /mnt/snapshot
sudo mount /dev/vg00/backup-snapshot /mnt/snapshot

# Backup from snapshot (as root, so every file in the mounted snapshot is readable)
sudo tar -czf ~/backups/system/snapshot-$(date +%Y%m%d_%H%M%S).tar.gz -C /mnt/snapshot .

# Unmount and remove snapshot
sudo umount /mnt/snapshot
sudo lvremove /dev/vg00/backup-snapshot
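
On Btrfs the equivalent is a read-only subvolume snapshot that you back up from and then delete. A minimal sketch, assuming your web root lives on a Btrfs subvolume mounted at /var/www:

# Create a read-only snapshot of the subvolume
sudo btrfs subvolume snapshot -r /var/www /var/www/.backup-snapshot

# Back up from the snapshot, then remove it
sudo tar -czf ~/backups/system/btrfs-snapshot-$(date +%Y%m%d_%H%M%S).tar.gz -C /var/www/.backup-snapshot .
sudo btrfs subvolume delete /var/www/.backup-snapshot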

Step 7: Restoring from Backups

Now let's learn how to restore your website when disaster strikes.

Restoring Website Files:

# Find your latest backup
ls -la ~/backups/files/website-*.tar.gz

# Extract website files
tar -xzf ~/backups/files/website-20231215_143022.tar.gz -C /tmp/

# Restore to web directory
sudo cp -r /tmp/yourdomain.com/* /var/www/yourdomain.com/

# Set proper permissions
sudo chown -R www-data:www-data /var/www/yourdomain.com/
sudo chmod -R 755 /var/www/yourdomain.com/

Restoring Configuration Files:

# Extract configuration backup
tar -xzf ~/backups/files/config-20231215_143022.tar.gz -C /tmp/

# Restore Nginx configuration
sudo cp /tmp/etc/nginx/sites-available/yourdomain.com /etc/nginx/sites-available/
sudo ln -s /etc/nginx/sites-available/yourdomain.com /etc/nginx/sites-enabled/

# Restore SSL certificates (the backup contains the full /etc/letsencrypt tree)
sudo mkdir -p /etc/letsencrypt
sudo cp -r /tmp/etc/letsencrypt/* /etc/letsencrypt/

# Test Nginx configuration
sudo nginx -t
sudo systemctl reload nginx

Restoring Database:

# List available database backups
ls -la ~/backups/database/database-*.sql.gz

# Decompress the latest backup
gunzip -c ~/backups/database/database-20231215_143022.sql.gz > ~/backups/database/latest.sql

# Create database if it doesn't exist
mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS yourdatabase;"

# Restore database
mysql -u yourdbuser -p'yourpassword' yourdatabase < ~/backups/database/latest.sql

# Clean up
rm ~/backups/database/latest.sql

Step 8: Complete Disaster Recovery Scenario

Let's walk through a complete website restoration from scratch.

Scenario: Server Failure, New Server Setup:

# 1. Set up new server (install LEMP/LAMP stack)
sudo apt update
sudo apt install nginx mysql-server php-fpm php-mysql

# 2. Download backups from remote location
# If using rsync:
rsync -avz user@backup-server.com:/remote/backups/yourdomain.com/ ~/backups/

# 3. Restore website files
tar -xzf ~/backups/files/website-latest.tar.gz -C /tmp/
sudo cp -r /tmp/yourdomain.com/* /var/www/yourdomain.com/
sudo chown -R www-data:www-data /var/www/yourdomain.com/

# 4. Restore configuration
tar -xzf ~/backups/files/config-latest.tar.gz -C /tmp/
sudo cp /tmp/etc/nginx/sites-available/yourdomain.com /etc/nginx/sites-available/
sudo ln -s /etc/nginx/sites-available/yourdomain.com /etc/nginx/sites-enabled/

# 5. Restore SSL certificates
sudo mkdir -p /etc/letsencrypt
sudo cp -r /tmp/etc/letsencrypt/* /etc/letsencrypt/

# 6. Create database
mysql -u root -p -e "CREATE DATABASE yourdatabase; CREATE USER 'yourdbuser'@'localhost' IDENTIFIED BY 'yourpassword'; GRANT ALL PRIVILEGES ON yourdatabase.* TO 'yourdbuser'@'localhost'; FLUSH PRIVILEGES;"

# 7. Restore database
gunzip -c ~/backups/database/database-latest.sql.gz | mysql -u yourdbuser -p'yourpassword' yourdatabase

# 8. Test and restart services (adjust php8.1-fpm to the PHP version you installed)
sudo nginx -t
sudo systemctl restart nginx php8.1-fpm mysql

# 9. Verify website is working
curl -I http://yourdomain.com
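
If DNS still points at the old server, you can check the restored site against the new server before switching DNS. curl's --resolve option sends the request to a specific IP while keeping the Host header (replace 203.0.113.10 with the new server's address):

# Hit the new server directly while DNS still points elsewhere
curl -I --resolve yourdomain.com:80:203.0.113.10 http://yourdomain.com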

Step 9: Monitoring and Verification

Regular monitoring ensures your backups are working correctly.

Backup Verification Script:

#!/bin/bash
# verify-backups.sh

BACKUP_DIR="/home/user/backups"
MIN_SIZE=1048576  # 1MB minimum backup size
ALERT_EMAIL="[email protected]"

# Check if recent backups exist
echo "Verifying backups..."

# Check file backups
if find $BACKUP_DIR/files -name "*.tar.gz" -mtime -1 | read; then
    echo "✓ Recent file backups found"
else
    echo "✗ No recent file backups found"
    echo "Backup verification failed - no recent file backups" | mail -s "Backup Alert" $ALERT_EMAIL
fi

# Check database backups
if find $BACKUP_DIR/database -name "*.sql.gz" -mtime -1 | read; then
    echo "✓ Recent database backups found"
else
    echo "✗ No recent database backups found"
    echo "Backup verification failed - no recent database backups" | mail -s "Backup Alert" $ALERT_EMAIL
fi

# Check backup sizes
find $BACKUP_DIR -name "*.tar.gz" -o -name "*.sql.gz" | while read backup; do
    size=$(stat -c%s "$backup")
    if [ $size -lt $MIN_SIZE ]; then
        echo "âš  Warning: Small backup file: $backup ($size bytes)"
    fi
done

echo "Backup verification completed"

Test Restore Procedure:

#!/bin/bash
# test-restore.sh

# Create test environment
mkdir -p /tmp/test-restore
TEST_DB="test_restore_$(date +%s)"

# Extract latest backup to test location
tar -xzf ~/backups/files/website-$(date +%Y%m%d)*.tar.gz -C /tmp/test-restore/

# Test database backup
echo "Creating test database..."
mysql -u root -p -e "CREATE DATABASE $TEST_DB;"

echo "Restoring database to test..."
gunzip -c ~/backups/database/database-$(date +%Y%m%d)*.sql.gz | mysql -u yourdbuser -p'yourpassword' $TEST_DB

# Verify restore
echo "Verifying test restore..."
if [ -d "/tmp/test-restore/yourdomain.com" ]; then
    echo "âś“ File restore test passed"
else
    echo "âś— File restore test failed"
fi

if mysql -u yourdbuser -p'yourpassword' -e "USE $TEST_DB; SHOW TABLES;" | grep -q "wp_posts"; then
    echo "âś“ Database restore test passed"
else
    echo "âś— Database restore test failed"
fi

# Cleanup
mysql -u root -p -e "DROP DATABASE $TEST_DB;"
rm -rf /tmp/test-restore

echo "Test restore completed"

Step 10: Advanced Backup Strategies

Incremental Backups with rsync:

#!/bin/bash
# backup-incremental.sh

SOURCE="/var/www/yourdomain.com"
BACKUP_DIR="/home/user/backups/incremental"
DATE=$(date +%Y%m%d)

# Create backup directory
mkdir -p $BACKUP_DIR/$DATE

# Perform incremental backup
rsync -av --link-dest=$BACKUP_DIR/latest/ $SOURCE/ $BACKUP_DIR/$DATE/

# Update latest symlink
rm -f $BACKUP_DIR/latest
ln -s $BACKUP_DIR/$DATE $BACKUP_DIR/latest

echo "Incremental backup completed: $DATE"

Encrypted Backups:

#!/bin/bash
# backup-encrypted.sh

BACKUP_DIR="/home/user/backups/encrypted"
ENCRYPTION_KEY="your-secure-encryption-key"
DATE=$(date +%Y%m%d_%H%M%S)

mkdir -p $BACKUP_DIR

# Create backup
tar -czf /tmp/backup-$DATE.tar.gz -C /var/www/ yourdomain.com
mysqldump -u yourdbuser -p'yourpassword' yourdatabase > /tmp/database-$DATE.sql

# Encrypt backup (-pbkdf2 uses a stronger key derivation and silences OpenSSL's warning)
openssl enc -aes-256-cbc -salt -pbkdf2 -in /tmp/backup-$DATE.tar.gz -out $BACKUP_DIR/backup-$DATE.enc -k $ENCRYPTION_KEY
openssl enc -aes-256-cbc -salt -pbkdf2 -in /tmp/database-$DATE.sql -out $BACKUP_DIR/database-$DATE.enc -k $ENCRYPTION_KEY

# Cleanup
rm -f /tmp/backup-$DATE.tar.gz /tmp/database-$DATE.sql

echo "Encrypted backup completed: $DATE"

Version Control Integration:

For configuration files and small websites, consider using Git:

# Initialize Git repository in website directory
cd /var/www/yourdomain.com
git init

# Add .gitignore
cat > .gitignore << EOF
*.log
cache/
tmp/
node_modules/
uploads/
.DS_Store
Thumbs.db
EOF

# Initial commit
git add .
git commit -m "Initial backup"

# Create remote repository (GitHub, GitLab, etc.)
git remote add origin git@github.com:youruser/yourdomain-backup.git
git push -u origin main

# Create backup script
nano ~/backup-scripts/git-backup.sh
#!/bin/bash
# git-backup.sh

WEBSITE_DIR="/var/www/yourdomain.com"
BACKUP_MESSAGE="Automated backup: $(date)"

# Navigate to website directory
cd $WEBSITE_DIR

# Add all changes
git add .

# Commit changes
git commit -m "$BACKUP_MESSAGE"

# Push to remote repository
git push origin main

echo "Git backup completed"

Common Backup Mistakes to Avoid

❌ Not testing restores - Always test your backup restoration process

❌ Keeping all backups on the same server - Use offsite storage for disaster recovery

❌ Ignoring backup logs and errors - Monitor backup success/failure notifications

❌ Not backing up frequently enough - Daily for most sites, more frequent for high-traffic sites

❌ Forgetting configuration files - Include Nginx/Apache configs, SSL certificates, and cron jobs

❌ Not documenting backup procedures - Create clear documentation for restore procedures

Best Practices for Website Backups

  • 3-2-1 Rule: 3 copies of data, 2 different media, 1 offsite
  • Regular testing: Test restore procedures monthly
  • Automated monitoring: Set up alerts for backup failures
  • Documentation: Document your backup and restore procedures
  • Encryption: Encrypt sensitive backup data
  • Retention policy: Keep backups for appropriate time periods
  • Version control: Use Git for configuration management

Final Thoughts

Backups are like insurance—you hope you never need them, but you're incredibly grateful when you do. The time you invest in setting up reliable backups will pay for itself the first time you need to restore your website.

Remember these key principles:

  • Automate everything - manual backups get forgotten
  • Test your restores - untested backups are no backups at all
  • Multiple locations - onsite for quick restores, offsite for disaster recovery
  • Regular monitoring - know when backups fail

Start with the basic backup scripts we've covered, test them thoroughly, and gradually add more sophisticated features as your needs grow. Your future self will thank you when disaster strikes and you can restore your website with confidence.

Happy backing up! 🛡️ Your website's safety net is now in place.
