Logstash in Practice, Part 2: Collecting Data, Storing It, Analyzing It, and Displaying It in Kibana


1. On the source server, Logstash tails the application log, decodes it from GBK to UTF-8, stamps events with the host IP, and pushes them into Redis. The shipper configuration is as follows:

input {
    file {
        path => "/tmp/activityserver.log"
        codec => json {
            charset => "GBK"
        }
        start_position => "beginning"
        sincedb_path => "/dev/null"
        type => "activitysun"
    }
}
filter {
    date {
        match => ["timestamp", "UNIX"]
        remove_field => ["time"]
    }
    #ruby {
    #    code => "event.timestamp.time.localtime"
    #}
    mutate {
        # some machines have no usable hostname, so set the host field explicitly
        replace => { "host" => "192.168.1.11" }
    }
}
output {
    #stdout {
    #    codec => plain {
    #        charset => "UTF-8"
    #    }
    #}
    file {
        path => "/tmp/logstash.log"
        codec => json {
            charset => "UTF-8"
        }
    }
    redis {
        host => ["192.168.1.18"]
        port => 26379
        data_type => "list"
        key => "activityserver"
        codec => json {
            charset => "UTF-8"
        }
    }
}
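The charset => "GBK" setting tells the json codec to decode the raw log bytes as GBK, after which events are re-encoded as UTF-8 on the way out. A minimal Python sketch of that transcoding step (the byte values below are just an illustrative GBK-encoded sample, not taken from the original log):

```python
# Decode GBK-encoded log bytes into text, then re-encode as UTF-8,
# mirroring what the json codec's charset option does per input line.
raw = b'\xd6\xd0\xce\xc4'       # the string "中文" encoded as GBK
text = raw.decode('gbk')        # decode to a Unicode string
utf8 = text.encode('utf-8')     # re-encode for downstream outputs
print(text)                     # -> 中文
```

If the input were already UTF-8, decoding it as GBK would silently produce mojibake rather than an error for many byte sequences, so the charset must match the producer's actual encoding.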

Start the shipper with the following command (in Logstash 2.x you can validate the file first by adding the --configtest flag):

/data/logstash-2.3.4/bin/logstash -f activitylog.conf

2. On the collector server, Logstash pulls events from Redis, derives a date tag from the timestamp to make analysis easier, and writes the events into Elasticsearch. The configuration is as follows:

input {
    redis {
        host => ["192.168.1.18"]
        port => 26379
        data_type => "list"
        key => "activityserver"
        codec => json {
            charset => "UTF-8"
        }
        #type => "activitysun"
    }
}
filter {
    ruby {
        code => "event['daytag'] = event.timestamp.time.localtime.strftime('%Y.%m.%d')"
    }
}
output {
    elasticsearch {
        hosts => ["127.0.0.1:19200"]
        index => "%{type}-%{daytag}"
        #index => "%{type}-%{+yyyy.MM.dd}"
    }
}
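The ruby filter derives a daytag field (e.g. 2016.08.01) from the event timestamp, so the elasticsearch output can build one index per day, such as activitysun-2016.08.01. A rough Python equivalent of that strftime logic (using UTC here so the example is deterministic; the actual filter converts to the server's local time):

```python
from datetime import datetime, timezone

def daytag(epoch_seconds):
    """Format an epoch timestamp as 'YYYY.MM.DD', like the ruby filter's strftime."""
    return datetime.fromtimestamp(epoch_seconds, tz=timezone.utc).strftime('%Y.%m.%d')

print(daytag(0))           # -> 1970.01.01
print(daytag(1470009600))  # -> 2016.08.01
```

Note that daily indices built from local time will not line up with the commented-out %{+yyyy.MM.dd} sprintf form, which formats @timestamp in UTC; that difference is presumably why the author computes the tag manually.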
3. Displaying the data in Kibana

Given the index naming above, create a Kibana index pattern such as activitysun-* to browse and chart the collected events.



