Analyzing Nginx Logs with a Shell Script
Author: reposted from the web | Published: 2013/12/20 11:40:38
This article shows how to analyze the logs of an Nginx load balancer with a shell script, so you can quickly find the top-ranked sites, client IPs, and so on; the script is suitable for use in a production environment. It covers two scenarios. In the first, Nginx acts as the front-end load balancer in an Nginx+Keepalived cluster, and the script for this case is as follows:
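The field numbers used by the script below ($1 for the client IP, $4 for the timestamp, $11 near the referrer) assume Nginx's default combined-style log format. As a hedged illustration (the log line is synthetic), this is how the timestamp field yields an hour:minute bucket:

```shell
# A typical access-log line (illustrative, not from a real server):
# 192.168.1.10 - - [20/Dec/2013:11:40:38 +0800] "GET /index.html HTTP/1.1" 200 612
# Field $4 is "[20/Dec/2013:11:40:38"; characters 14-18 of it are "11:40",
# which is why the script uses `cut -c 14-18` to group requests by minute.
echo '192.168.1.10 - - [20/Dec/2013:11:40:38 +0800] "GET /index.html HTTP/1.1" 200 612' \
  | awk '{ print $4 }' | cut -c 14-18
# → 11:40
```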
vim log-nginx.sh

#!/bin/bash
# Summarize an Nginx access log: top client IPs, busiest minutes,
# most-requested pages, and the top IPs within each busy minute.

if [ $# -eq 0 ]; then
    echo "Error: please specify logfile."
    exit 1
else
    LOG=$1
fi

if [ ! -f "$LOG" ]; then
    echo "Sorry, sir, I can't find this nginx log file, pls try again!"
    exit 1
fi

####################################################
echo "Most of the ip:"
echo "-------------------------------------------"
awk '{ print $1 }' "$LOG" | sort | uniq -c | sort -nr | head -10
echo
echo
####################################################
echo "Most of the time:"
echo "--------------------------------------------"
# Field $4 is the timestamp "[dd/Mon/yyyy:hh:mm:ss"; columns 14-18 are "hh:mm".
awk '{ print $4 }' "$LOG" | cut -c 14-18 | sort | uniq -c | sort -nr | head -10
echo
echo
####################################################
echo "Most of the page:"
echo "--------------------------------------------"
awk '{ print $11 }' "$LOG" | sed 's/^.*\(.cn*\)"/\1/g' | sort | uniq -c | sort -rn | head -10
echo
echo
####################################################
echo "Most of the time / Most of the ip:"
echo "--------------------------------------------"
awk '{ print $4 }' "$LOG" | cut -c 14-18 | sort -n | uniq -c | sort -nr | head -10 > timelog

# For each of the ten busiest minutes, list its request count and its top IPs.
for i in $(awk '{ print $2 }' timelog)
do
    num=$(grep "$i" timelog | awk '{ print $1 }')
    echo "$i $num"
    ip=$(grep "$i" "$LOG" | awk '{ print $1 }' | sort -n | uniq -c | sort -nr | head -10)
    echo "$ip"
    echo
done
rm -f timelog
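The core pattern the script repeats is awk (extract a field) | sort | uniq -c (count) | sort -nr (rank) | head (top N). A minimal, self-contained sketch of that pipeline against a tiny synthetic log (the file name access.sample is hypothetical, used only for this demonstration):

```shell
# Build a three-line sample log (synthetic data, for illustration only):
printf '%s\n' \
  '10.0.0.1 - - [20/Dec/2013:11:40:38 +0800] "GET / HTTP/1.1" 200 612' \
  '10.0.0.2 - - [20/Dec/2013:11:40:39 +0800] "GET / HTTP/1.1" 200 612' \
  '10.0.0.1 - - [20/Dec/2013:11:40:40 +0800] "GET / HTTP/1.1" 200 612' \
  > access.sample

# Count requests per client IP, most frequent first
# (10.0.0.1 appears twice, so it ranks first):
awk '{ print $1 }' access.sample | sort | uniq -c | sort -nr | head -10

rm -f access.sample
```

Running the full script is the same idea at scale, e.g. `sh log-nginx.sh /usr/local/nginx/logs/access.log`.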