BASH
Automatically Purging Old Log Files
Create a Bash script to automatically identify and delete log files older than a specified number of days, helping manage disk space on web servers.
#!/bin/bash
# --- Configuration ---
LOG_DIR="/var/log/nginx"   # Directory containing log files
DAYS_TO_KEEP="30"          # Files older than this will be deleted

# Abort early if the log directory does not exist
if [ ! -d "${LOG_DIR}" ]; then
    echo "Error: log directory ${LOG_DIR} does not exist." >&2
    exit 1
fi

echo "Cleaning up log files older than ${DAYS_TO_KEEP} days in ${LOG_DIR}..."

# Use find to locate matching files and delete them
# -type f: only consider regular files
# -name "*.log": only files ending in .log (adjust as needed)
# -mtime +N: files last modified more than N days ago
# -print: list each file before -delete removes it
# Note: find exits 0 even when no files match; a non-zero status means a real error
if find "${LOG_DIR}" -type f -name "*.log" -mtime +"${DAYS_TO_KEEP}" -print -delete; then
    echo "Log cleanup completed successfully."
else
    echo "Warning: log cleanup encountered errors." >&2
fi

# Optional: also clean up compressed (rotated) logs if they exist
if find "${LOG_DIR}" -type f -name "*.log.gz" -mtime +"${DAYS_TO_KEEP}" -print -delete; then
    echo "Compressed log cleanup completed successfully."
else
    echo "Warning: compressed log cleanup encountered errors." >&2
fi
How it works: This script automates the deletion of old log files so they cannot consume excessive disk space. It defines a `LOG_DIR` and a `DAYS_TO_KEEP` variable, then uses `find` to search the directory for regular files (`-type f`) that match a name pattern (`-name "*.log"`) and were last modified more than `DAYS_TO_KEEP` days ago (`-mtime +${DAYS_TO_KEEP}`). The `-print` option lists each file just before `-delete` removes it, leaving a record of what was purged. A second `find` handles compressed, rotated logs (`.log.gz`). Note that `find` exits with status 0 even when nothing matches, so a non-zero status indicates an actual error, such as a permission problem, rather than an empty result.
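To make the cleanup truly automatic, the script can be scheduled with cron. The entry below is a hedged example: `/usr/local/bin/purge_logs.sh` is an assumed location for the script above, and the schedule (daily at 02:30) is arbitrary; adjust both to taste and make sure the script is executable (`chmod +x`):

```shell
# Example crontab entry (install with `crontab -e`); the script path
# and log path are illustrative, not prescribed by the script above.
# Runs daily at 02:30 and appends the cleanup output to its own log.
30 2 * * * /usr/local/bin/purge_logs.sh >> /var/log/purge_logs_cron.log 2>&1
```

For cleaning a system directory such as /var/log/nginx, the entry typically belongs in root's crontab, since the files are usually not writable by ordinary users.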