
2.25.4. Analysis of logs with console commands

You can analyze logs using console commands either on the hosting or on a local PC.

To analyze logs using console commands on the hosting, do the following:
  1. Download the logs to your PC.
  2. Upload the downloaded logs to the root directory of your hosting account using the file manager or any FTP client.
  3. If the logs are in an archive file, unzip them using the file manager.
  4. Run the console commands presented below in the terminal.
To analyze logs using console commands on a local PC, do the following:
  1. Download the logs to your PC.
  2. If the logs are in an archive file, unzip them.
  3. Start the terminal on your PC.
  4. Run the console commands presented below in the terminal.
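On a local PC, unpacking a gzip-compressed log in the terminal can look like the sketch below. The file name access.log.gz and the log contents are hypothetical; the first two lines only create a compressed file so the example is self-contained.

```shell
# Create a small compressed log so the example is self-contained
# (in practice, access.log.gz is the archive downloaded from the hosting).
printf '203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512 "-" "curl/7.68.0"\n' > access.log
gzip -f access.log          # produces access.log.gz

# Unpack: gunzip replaces access.log.gz with the plain-text access.log.
gunzip -f access.log.gz

# Sanity check: the log is now readable text.
head -n 1 access.log
```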

Note

In all commands, replace access.log with the name of the downloaded log file or the full path to it.
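The awk field numbers used throughout ($1, $4, $7, $9, and so on) refer to whitespace-separated columns of the combined log format. A minimal sketch with hypothetical data, showing the core pattern most of the commands below share:

```shell
# Two hypothetical IPs in combined log format:
# $1 = client IP, $4 = [date:time, $7 = request URI, $9 = status code.
cat > sample.log <<'EOF'
203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "curl/7.68.0"
203.0.113.5 - - [10/Oct/2023:13:55:37 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "curl/7.68.0"
198.51.100.7 - - [10/Oct/2023:13:56:01 +0000] "GET /missing HTTP/1.1" 404 209 "-" "curl/7.68.0"
EOF

# The core pattern: extract a field, sort, count duplicates,
# then sort by the count in descending order.
awk '{print $9}' sample.log | sort | uniq -c | sort -rn
```

For this sample, the pipeline reports the 200 status twice and the 404 status once, with the most frequent code first.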

Server response codes by number of occurrences, in descending order:

awk '{print $9}' access.log | sort | uniq -c | sort -rn

25 most active IPs:

cat access.log | awk '{ print $1 }' | sort | uniq -c | sort -rn | head -n 25

The number of requests from each IP, in descending order:

cat access.log | awk '{print "requests from " $1}' | sort | uniq -c | sort -rn

Top 10 Referers:

cat access.log | awk -F \" ' { print $4 } ' | grep -v '-' | sort | uniq -c | sort -rn | head -n 10

Top 10 User-Agents:

cat access.log | awk -F \" ' { print $6 } ' | sort | uniq -c | sort -rn | head -n 10

Total number of requests per day:

awk '{print $4}' access.log | cut -d: -f1 | uniq -c

Hourly number of requests per day:

  • If the log contains information for only one day:
    cat access.log | cut -d [ -f2 | cut -d] -f1 | awk -F: '{print $2":00"}' | sort -n | uniq -c
  • If the log contains information for several days (instead of DD/Mon substitute the desired day of the month and the first three letters of the month name in English):
    grep "DD/Mon" access.log | cut -d [ -f2 | cut -d] -f1 | awk -F: '{print $2":00"}' | sort -n | uniq -c

Per-minute number of requests for the specified hour of the specified day (instead of DD/Mon/YEAR:HH substitute the desired day of the month, the first three letters of the month name in English, the year, and the hour); only minutes with more than 10 requests are printed:

grep "DD/Mon/YEAR:HH" access.log | cut -d [ -f2 | cut -d] -f1 | awk -F: '{print $2":"$3}' | sort -nk1 -nk2 | uniq -c | awk '{ if ($1 > 10) print $0}'

Number of unique visitors:

cat access.log | awk '{print $1}' | sort -u | wc -l

25 most popular URIs:

cat access.log | awk '{ print $7 }' | sort | uniq -c | sort -rn | head -n 25

List of unique IPs:

cat access.log | awk '{print $1}' | sort | uniq

List of unique IPs with date and time for each request from them:

cat access.log | awk '{print $1 " " $4}' | sort | uniq

List of unique IPs with date, time and method for each request from them:

cat access.log | awk '{print $1 " " $4 " " $6}' | sort | uniq

List of unique IPs with date, time and URI for each request from them:

cat access.log | awk '{print $1 " " $4 " " $7}' | sort | uniq