

Exploring nginx log statistics on Linux


Log analysis matters a lot when maintaining a server. Over the weekend I spent a day at home tinkering with nginx log analysis; what follows is the process I went through.
It started when I saw the browser statistics that @沈洲 on my team had built. I wanted to try it myself, so I stayed in all day (apart from meals, of course).

Approach

    First, set up a scheduled task that runs a script at 23:55 every night; the script rotates the log and converts it into the JSON data we need.
    Build an API that serves the analyzed JSON data, e.g. browser model and OS model.
    Draw pie charts from that API.

Scheduled task: parsing the log

Write a shell script that runs at 23:55 every day, roughly as follows:

#!/bin/bash

# Copy the current log to last.log for parsing
cp -f /home/access.log /home/last.log

# Rotate: rename the log to a file named after today's date
mv /home/access.log /home/access/$(date +%Y%m%d).log

# Run the node script that parses last.log
node '/home/parseJson.js'

# Signal nginx to reopen its log file and start writing a fresh one
kill -USR1 `cat /var/run/nginx.pid`

parseJson.js roughly uses the nginxparser package to parse each log line and saves the result as a Y-m-d.json file.
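
The parseJson.js script itself isn't shown here (it is a Node script built on the nginxparser package). Purely to illustrate the idea of walking the rotated log line by line, counting things like browsers and status codes, and dumping the result to a dated JSON file, here is a minimal Python sketch; the paths and the very crude user-agent matching are assumptions, not the author's code:

# Rough sketch of what parseJson.js does; paths and the simplistic
# user-agent matching below are assumptions, not the original code.
import json
import re
import time
from collections import defaultdict

LOG_PATH = "/home/last.log"   # the copy made by the shell script above

# Just enough user-agent detection to illustrate the aggregation idea.
BROWSERS = ["Chrome", "Firefox", "Safari", "Opera", "MSIE"]

def parse(log_path):
    stats = {"browser": defaultdict(int), "http_status": defaultdict(int)}
    status_re = re.compile(r'" (\d{3}) ')   # status code right after the quoted request
    with open(log_path) as f:
        for line in f:
            m = status_re.search(line)
            if m:
                stats["http_status"][m.group(1)] += 1
            for name in BROWSERS:
                if name in line:
                    stats["browser"][name] += 1
                    break
    return stats

if __name__ == "__main__":
    with open(time.strftime("%Y-%m-%d") + ".json", "w") as out:
        json.dump(parse(LOG_PATH), out)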

The log analysis API

Set up an HTTP server that, on each request, processes the JSON data generated above. You can slice it however you like, e.g. by browser version, browser model, or OS version, and of course you can add a cache. A sample response looks like this:


{"errcode":0,"browser":{"Chrome":{"count":5047,"version":{"5":2,"10":2,"11":22,"12":14,"16":2,"20":1,"21":366,"24":2,"28":3,"29":21,"30":3,"31":198,"32":2,"33":12,"34":40,"35":74,"36":49,"37":18,"38":101,"39":89,"40":2,"41":139,"42":151,"43":113,"44":181,"45":2741,"46":665,"47":21,"48":13}},"IE":{"count":6717,"version":{"5":3,"6":3574,"7":97,"8":717,"9":2169,"10":35,"11":105,"or":17}},"Baidu":{"count":1372,"version":{"spider":1372}},"Firefox":{"count":1368,"version":{"0":4,"1":2,"2":3,"3":60,"4":2,"6":839,"7":22,"10":1,"13":4,"14":14,"15":1,"21":6,"22":4,"24":3,"26":190,"28":9,"29":12,"30":6,"31":16,"34":12,"36":8,"37":18,"38":7,"39":1,"40":26,"41":98}},"Android Browser":{"count":203,"version":{"3":13,"4":190}},"Opera":{"count":37,"version":{"12":22,"28":10,"32":5}},"Mobile Safari":{"count":669,"version":{"4":2,"5":47,"6":111,"7":321,"8":158,"9":30}},"baidu":{"count":38,"version":{"5":2,"6":9,"":24,"boxapp":3}},"WebKit":{"count":36,"version":{"533":2,"534":30,"601":4}},"UCBrowser":{"count":95,"version":{"9":37,"10":58}},"Safari":{"count":60,"version":{"5":21,"7":2,"8":4,"9":33}},"QQBrowser":{"count":21,"version":{"5":21}},"MIUI Browser":{"count":4,"version":{"2":4}},"Mozilla":{"count":2,"version":{"5":2}},"IEMobile":{"count":5,"version":{"9":4,"11":1}},"Edge":{"count":7,"version":{"12":7}},"Chromium":{"count":9,"version":{"14":6,"15":3}},"Maxthon":{"count":17,"version":{"4":17}},"Silk":{"count":3,"version":{"1":3}},"Iceweasel":{"count":6,"version":{"38":6}},"Fennec":{"count":2,"version":{"9":2}}},"os":{"Mac OS":{"count":2753,"version":{"10":2753}},"Windows":{"count":10642,"version":{"7":4718,"8":380,"10":184,"2000":2,"Vista":514,"XP":4814,"NT":13," s":17}},"arch":{"count":2871,"version":{"slurp":1489,"spider":1340,"":14,"bot":28}},"Android":{"count":364,"version":{"2":8,"3":27,"4":321,"5":3,"6":5}},"Linux":{"count":143,"version":{"x86_64":125,"i686":18}},"iOS":{"count":691,"version":{"4":11,"5":43,"6":111,"7":321,"8":158,"9":47}},"Gentoo":{"count":10,"version":{"Firefox":10}},"BlackBerry":{"count":3,"version":{"4":3}},"Windows Phone":{"count":1,"version":{"8":1}},"Ubuntu":{"count":9,"version":{"10":9}},"Windows Phone OS":{"count":4,"version":{"7":4}}},"http_status":{"200":6030,"301":7385,"302":137,"304":1451,"403":814,"404":1672,"405":2},"robot":{"Googlebot":2553,"Yahoo":1489,"bingbot":5702,"Baiduspider":1372,"Blogtrottr":631,"Feedly":222,"HaosouSpider":833,"MJ12bot":2261,"AdsBot-Google-Mobile":44,"AdsBot-Google":44,"Sogou":3250,"YisouSpider":208},"http_bot":{"bot":18609,"all":38010},"time":1446017644780}


A complete nginx log statistics workflow

This part documents a Python-based nginx log statistics workflow. The motivation is that the system has recently been crawled aggressively by unknown programs; even with anti-crawling measures in place, we still need to find out which IPs account for the most requests. The idea is to analyze the nginx logs and rank those IPs. The steps are:

    Rotate the nginx log daily;
    Configure the nginx log format;
    Write Python code that, before each daily rotation, counts IP hits in access.log and stores the results in MongoDB (a storage sketch follows the script's output below);
    Build a web query against MongoDB to report the statistics (a sketch is at the end of this section).

Each step is described in detail below.

1. Daily nginx log rotation

This is done with a hand-written shell script, which is then scheduled with crontab.

The shell script is as follows:


    #!/bin/bash
    ## Run this script at midnight
    ## Directory that holds the nginx log files
    LOGS_PATH=/usr/local/nginx/logs
    ## Yesterday's date, formatted yyyy-MM-dd
    YESTERDAY=$(date -d "yesterday" +%Y-%m-%d)
    ## Move the log file aside
    mv ${LOGS_PATH}/access.log ${LOGS_PATH}/access_${YESTERDAY}.log
    ## Send USR1 to the nginx master process; USR1 makes nginx reopen its log files
    kill -USR1 $(cat /usr/local/nginx/nginx.pid)

Add it to crontab:


    0 0 * * * /bin/bash /usr/local/nginx/sbin/cut-log.sh 



2. Configure the nginx log format

Open nginx.conf and add the following (note that log_format has to be declared in the http block, while access_log can go in the http or server block):


    log_format  access  '$remote_addr - $remote_user [$time_local] "$request" '
                        '$status $body_bytes_sent "$http_referer" '
                        '"$http_user_agent" $http_x_forwarded_for';
    access_log  /usr/local/nginx/logs/access.log  access;



Once that's in place, reload nginx:


    ./nginx -s reload 



3. Write Python code that counts IP hits in access.log before each daily rotation and stores the results in MongoDB

Download pymongo, upload it to the server, and install it:


    # tar zxvf pymongo-1.11.tar.gz 
    # cd pymongo-1.11 
    # python setup.py install 



A sample Python script for connecting to MongoDB:


    $ cat conn_mongodb.py
    #!/usr/bin/python

    import pymongo
    import random

    # Connect to MongoDB (pymongo 1.x API; newer releases use pymongo.MongoClient)
    conn = pymongo.Connection("127.0.0.1", 27017)
    # Select the tage database
    db = conn.tage
    # Authenticate
    db.authenticate("tage", "123")
    # Drop the user collection
    db.user.drop()
    # Insert a single document
    db.user.save({'id': 1, 'name': 'kaka', 'sex': 'male'})
    # Insert a batch of documents in a loop
    for id in range(2, 10):
        name = random.choice(['steve', 'koby', 'owen', 'tody', 'rony'])
        sex = random.choice(['male', 'female'])
        db.user.insert({'id': id, 'name': name, 'sex': sex})
    # Print every document
    content = db.user.find()
    for i in content:
        print i



Write the Python statistics script:


    #encoding=utf8

    import re

    zuidaima_nginx_log_path = "/usr/local/nginx/logs/www.zuidaima.com.access.log"
    # Each access-log line starts with the client IP
    pattern = re.compile(r'^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}')

    def stat_ip_views(log_path):
        # Count how many requests each IP made
        ret = {}
        f = open(log_path, "r")
        for line in f:
            match = pattern.match(line)
            if match:
                ip = match.group(0)
                if ip in ret:
                    views = ret[ip]
                else:
                    views = 0
                views = views + 1
                ret[ip] = views
        f.close()
        return ret

    def run():
        ip_views = stat_ip_views(zuidaima_nginx_log_path)
        # Keep track of the single IP with the most hits
        max_ip_view = {}
        for ip in ip_views:
            views = ip_views[ip]
            if len(max_ip_view) == 0:
                max_ip_view[ip] = views
            else:
                _ip = max_ip_view.keys()[0]
                _views = max_ip_view[_ip]
                if views > _views:
                    max_ip_view[ip] = views
                    max_ip_view.pop(_ip)

            print "ip:", ip, ",views:", views
        # Total number of distinct IPs
        print "total:", len(ip_views)
        # The IP with the most hits
        print "max_ip_view:", max_ip_view

    run()



Output of the script above:


    ip: 221.221.155.53 ,views: 1 
    ip: 221.221.155.54 ,views: 2 
    total: 2 
    max_ip_view: {'221.221.155.54': 2} 
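
The script above only prints the counts; step 3 also calls for storing them in MongoDB, which isn't shown. A minimal sketch of that piece, assuming a local MongoDB with a hypothetical nginx_stats.ip_stats collection, and using the newer pymongo MongoClient API (pymongo 3+, not the 1.11 release installed earlier):

    # Hypothetical: persist the per-IP counts from stat_ip_views() in MongoDB.
    # The nginx_stats database and ip_stats collection names are assumptions.
    import time
    import pymongo

    def save_ip_views(ip_views):
        client = pymongo.MongoClient("127.0.0.1", 27017)
        coll = client.nginx_stats.ip_stats
        day = time.strftime("%Y-%m-%d")
        for ip, views in ip_views.items():
            # One document per IP per day; rerunning the script just updates it.
            coll.update_one({"ip": ip, "day": day},
                            {"$set": {"views": views}},
                            upsert=True)
        client.close()

    # e.g. at the end of run():
    #     save_ip_views(stat_ip_views(zuidaima_nginx_log_path))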

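Step 4 in the list, the web query against MongoDB, is also not shown in the original. A minimal sketch, assuming Flask is available and reusing the hypothetical ip_stats collection from the previous sketch:

    # Hypothetical sketch of step 4: a small web endpoint that returns the
    # busiest IPs for a given day from the ip_stats collection.
    from flask import Flask, jsonify, request
    import pymongo

    app = Flask(__name__)
    coll = pymongo.MongoClient("127.0.0.1", 27017).nginx_stats.ip_stats

    @app.route("/top_ips")
    def top_ips():
        day = request.args.get("day")                 # e.g. ?day=2015-10-28
        limit = int(request.args.get("limit", 10))
        cursor = (coll.find({"day": day}, {"_id": 0})
                      .sort("views", pymongo.DESCENDING)
                      .limit(limit))
        return jsonify({"day": day, "ips": list(cursor)})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=5000)

Hitting /top_ips?day=YYYY-MM-DD would then return the ten busiest IPs recorded for that day.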