BASH
Automate Daily Website/Project Backups
A robust Bash script that automates backing up website files and, optionally, a database into a timestamped archive, an essential part of any disaster-recovery plan.
#!/bin/bash
set -euo pipefail  # abort on errors, unset variables, and failed pipelines

# Configuration
BACKUP_DIR="/var/backups/webprojects"
PROJECT_PATH="/var/www/mywebsite"
DB_NAME="my_database_name"
DB_USER="db_user"
DB_PASS="db_password"  # note: passing a password on the command line is visible in `ps`; prefer ~/.my.cnf
# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Get current date for filename
DATE=$(date +%Y%m%d%H%M%S)
BACKUP_FILENAME="mywebsite_backup_$DATE.tar.gz"
# Backup website files
tar -czf "$BACKUP_DIR/$BACKUP_FILENAME" -C "$(dirname "$PROJECT_PATH")" "$(basename "$PROJECT_PATH")"
# Optional: Backup database (uncomment and configure if needed)
# mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" | gzip > "$BACKUP_DIR/mywebsite_db_backup_$DATE.sql.gz"
# Clean up old backups (e.g., keep last 7 days)
find "$BACKUP_DIR" -type f -name "mywebsite_backup_*.tar.gz" -mtime +7 -delete
# find "$BACKUP_DIR" -type f -name "mywebsite_db_backup_*.sql.gz" -mtime +7 -delete
echo "Backup completed: $BACKUP_FILENAME"
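Before relying on a backup, it is worth verifying that the archive is readable and restores correctly. The snippet below is a sketch of that check using a throwaway directory under `/tmp` (all paths here are illustrative, not part of the script above); it uses the same `tar -C` pattern the script does:

```shell
# Build a tiny sample "site" and archive it the way the backup script does:
# -C into the parent directory, then archive the basename.
mkdir -p /tmp/demo_site
echo "hello" > /tmp/demo_site/index.html
tar -czf /tmp/demo_backup.tar.gz -C /tmp demo_site

# Sanity check: list the archive's contents without extracting.
tar -tzf /tmp/demo_backup.tar.gz

# Restore into a staging directory and compare against the original.
mkdir -p /tmp/restore_test
tar -xzf /tmp/demo_backup.tar.gz -C /tmp/restore_test
diff /tmp/demo_site/index.html /tmp/restore_test/demo_site/index.html && echo "restore OK"
```

Running this kind of restore test periodically catches corrupt or incomplete archives before you need them in a real recovery.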
How it works: This script creates a timestamped, gzip-compressed archive of the project directory using `tar -czf` (the `-z` flag applies gzip compression as part of the same step). The `-C` option changes into the project's parent directory first, so the archive stores relative paths rather than the full absolute path. Commented-out lines show how to back up a MySQL database with `mysqldump`, and the `find ... -mtime +7 -delete` line removes archives older than seven days to keep disk usage in check. The script is typically scheduled via a cron job for regular execution.
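To schedule the script with cron, save it somewhere on disk, make it executable, and add a crontab entry. The install path and log file below are assumptions, not part of the script itself:

```shell
# Example crontab entry (edit with `crontab -e`):
# run the backup every day at 02:30 and append output to a log file.
# /usr/local/bin/backup_website.sh is an assumed install location.
30 2 * * * /usr/local/bin/backup_website.sh >> /var/log/website_backup.log 2>&1
```

Redirecting both stdout and stderr to a log file makes failed runs visible, since cron otherwise discards output unless mail is configured.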