
Extract Unique IPs and Request Counts from Access Logs

Analyze web server access logs using Bash to identify unique visitor IP addresses, count their requests, and gain insights into traffic patterns and potential security issues.

#!/bin/bash

# Usage: analyze_access_log.sh <log_file_path>
# Example: analyze_access_log.sh /var/log/nginx/access.log

if [ -z "$1" ]; then
    echo "Usage: $0 <log_file_path>"
    echo "Example: $0 /var/log/apache2/access.log"
    exit 1
fi

LOG_FILE="$1"

if [ ! -f "$LOG_FILE" ]; then
    echo "Error: Log file '$LOG_FILE' not found."
    exit 1
fi

echo "Analyzing unique IPs and request counts from '$LOG_FILE'"
echo

# This assumes a common log format where IP is the first field.
# 1. Extract the first field (IP address) using awk or cut.
# 2. Sort the IPs to group identical ones together.
# 3. Count unique occurrences using uniq -c.
# 4. Sort by count in reverse numerical order for most frequent IPs.

awk '{print $1}' "$LOG_FILE" | sort | uniq -c | sort -rn

echo
echo "Analysis complete. The list shows 'count IP_address'."
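A quick way to sanity-check the core pipeline is to feed it a few fabricated log lines instead of a real log file (the IPs below are reserved documentation addresses, not real traffic). The most frequent IP, 203.0.113.5 with two requests, should come out on top:

printf '%s\n' \
  '203.0.113.5 - - [10/Oct/2024:13:55:36 +0000] "GET / HTTP/1.1" 200 512' \
  '198.51.100.7 - - [10/Oct/2024:13:55:40 +0000] "GET /x HTTP/1.1" 404 0' \
  '203.0.113.5 - - [10/Oct/2024:13:55:41 +0000] "GET /a HTTP/1.1" 200 128' \
  | awk '{print $1}' | sort | uniq -c | sort -rn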
How it works:

This script lets web developers and system administrators quickly analyze web server access logs. By piping the log through `awk`, `sort`, and `uniq`, it extracts every IP address, counts the occurrences of each unique IP, and sorts the results by request count in descending order. The most active visitors, and with them potential bots or distributed denial-of-service (DDoS) sources, appear at the top of the list, giving a fast read on traffic patterns and security posture.
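On a busy server the full list can run to thousands of lines. Two common follow-ups, not part of the script above, are to keep only the top talkers with `head`, or to keep only IPs above a request threshold using an `awk` count. The log path and the threshold of 100 are illustrative placeholders; adjust both for your environment:

LOG_FILE=/var/log/nginx/access.log   # placeholder: point at your own log

# Top 10 IPs by request count
awk '{print $1}' "$LOG_FILE" | sort | uniq -c | sort -rn | head -n 10

# Only IPs with more than 100 requests (threshold is illustrative)
awk '{count[$1]++} END {for (ip in count) if (count[ip] > 100) print count[ip], ip}' "$LOG_FILE" | sort -rn

The `awk` associative-array variant avoids the `sort | uniq -c` pass entirely, which can be noticeably faster on very large logs since only one sort runs, over the already-aggregated counts.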
