BASH
Advanced Log Filtering and Error Extraction
Use this Bash snippet to efficiently filter log files, extract specific error patterns, and count their occurrences for quick debugging and system health checks.
#!/bin/bash
# Path to the log file and the error patterns to match (extended regex).
LOG_FILE="/var/log/apache2/error.log"
ERROR_PATTERN="PHP Fatal error|CRITICAL|ERROR:"

# Abort early if the log file does not exist.
if [ ! -f "$LOG_FILE" ]; then
    echo "Error: Log file '$LOG_FILE' not found." >&2
    exit 1
fi

echo "--- Recent Errors from $LOG_FILE ---"
grep -E "$ERROR_PATTERN" "$LOG_FILE" | tail -n 20

printf '\n--- Error Summary ---\n'
# Strip repetitive prefixes so identical errors collapse into one line,
# then count and rank the unique messages.
grep -E "$ERROR_PATTERN" "$LOG_FILE" | awk '{
    gsub(/ERROR: /,"");
    gsub(/PHP Fatal error: /,"");
    gsub(/\[client [^]]*\] /,"");  # POSIX awk has no non-greedy .*?; match up to the closing bracket instead
    print
}' | sort | uniq -c | sort -nr
How it works: This script filters a specified log file for common error patterns (`grep -E`), displays the last 20 occurrences for immediate review, and then provides a summarized count of unique error messages. It uses `awk` to clean up repetitive log prefixes before `sort` and `uniq -c` aggregate the error output for a clearer overview.
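The same grep/awk/sort/uniq pipeline can be exercised against a small synthetic log to see the aggregation in action. This is a minimal sketch using a temporary file; the log lines, client addresses, and error messages are made up for illustration:

```shell
#!/bin/bash
# Build a throwaway log with repeated error lines (hypothetical content).
TMP_LOG=$(mktemp)
cat > "$TMP_LOG" <<'EOF'
[Mon Jan 01 00:00:01] [client 10.0.0.1] PHP Fatal error: Out of memory
[Mon Jan 01 00:00:02] [client 10.0.0.2] PHP Fatal error: Out of memory
ERROR: disk full
[Mon Jan 01 00:00:03] notice: startup complete
EOF

# Same pipeline as the script: filter, strip prefixes, count unique messages.
SUMMARY=$(grep -E "PHP Fatal error|CRITICAL|ERROR:" "$TMP_LOG" | awk '{
    gsub(/^\[[^]]*\] /,"");        # drop the leading timestamp
    gsub(/\[client [^]]*\] /,"");  # drop the client address
    gsub(/PHP Fatal error: /,"");
    gsub(/ERROR: /,"");
    print
}' | sort | uniq -c | sort -nr)

# Prints "2 Out of memory" then "1 disk full" (count padding varies by uniq).
echo "$SUMMARY"

rm -f "$TMP_LOG"
```

The "notice" line never reaches awk because grep drops it, and the two out-of-memory lines collapse into a single counted entry once their timestamps and client prefixes are removed.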