Configuring Flume to collect data into Elasticsearch
Starting Flume usually fails with one of two kinds of errors: either log4j is not configured, or various jars are missing. So:
[root@laiym ~]# cp /usr/local/elasticsearch/lib/* /usr/local/flume/lib/
If the same jars already exist there, they do not need to be overwritten.
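One way to get that "copy only what is missing" behavior is GNU cp's `-n` (`--no-clobber`) flag, which skips files that already exist at the destination. A self-contained sketch (the temp directories stand in for the Elasticsearch and Flume lib directories above):

```shell
# Demo of cp -n (--no-clobber): files already present at the destination are
# skipped, so jars Flume already ships with would not be overwritten.
src=$(mktemp -d); dst=$(mktemp -d)
echo new   > "$src/shared.jar"   # present in both src and dst
echo old   > "$dst/shared.jar"
echo extra > "$src/missing.jar"  # present only in src
cp -n "$src"/* "$dst"/ || true   # || true: newer coreutils exit nonzero when skipping
cat "$dst/shared.jar"            # old   (kept, not overwritten)
cat "$dst/missing.jar"           # extra (copied)
```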
Below is a Flume-to-Elasticsearch configuration file; for the exact meaning of each property, see the definitions in the official documentation.
#File name: flume-es.conf
#Define the names of the sources, channel and sinks
agent.sources = tail
agent.sinks = elasticsearch
agent.channels = memoryChannel
#Source details
agent.sources.tail.type = exec
agent.sources.tail.command = tail -F /var/log/secure
agent.sources.tail.interceptors=i1 i2 i3
agent.sources.tail.interceptors.i1.type=regex_extractor
agent.sources.tail.interceptors.i1.regex =(\\w+\\s+\\w+\\s+\\d+\\\:\\d+\\\:\\d+)\\s+(\\w+)\\s+(\\w+)
agent.sources.tail.interceptors.i1.serializers = s1 s2 s3
agent.sources.tail.interceptors.i1.serializers.s1.name= time
agent.sources.tail.interceptors.i1.serializers.s2.name= hostname
agent.sources.tail.interceptors.i1.serializers.s3.name= service
agent.sources.tail.interceptors.i2.type=org.apache.flume.interceptor.TimestampInterceptor$Builder
agent.sources.tail.interceptors.i3.type=org.apache.flume.interceptor.HostInterceptor$Builder
agent.sources.tail.interceptors.i3.hostHeader = host
#Channel details
agent.channels.memoryChannel.type = memory
agent.channels.memoryChannel.capacity = 1000000
agent.channels.memoryChannel.transactionCapacity = 5000
#agent.channels.memoryChannel.keep-alive = 10
#Sink details
agent.sinks.elasticsearch.type=org.apache.flume.sink.elasticsearch.ElasticSearchSink
agent.sinks.elasticsearch.batchSize=100
agent.sinks.elasticsearch.hostNames=127.0.0.1:9300
agent.sinks.elasticsearch.indexName=linux_secure
agent.sinks.elasticsearch.indexType=message
agent.sinks.elasticsearch.clusterName=elasticsearch
agent.sinks.elasticsearch.serializer=org.apache.flume.sink.elasticsearch.ElasticSearchLogStashEventSerializer
#Bind the source and sink to the channel
agent.sources.tail.channels = memoryChannel
agent.sinks.elasticsearch.channel = memoryChannel
The sample log is the Linux secure log (/var/log/secure):
Feb 23 17:38:20 laiym sshd[1591]: pam_unix(sshd:session): session closed for user root
Feb 23 17:38:20 laiym sshd[1616]: pam_unix(sshd:session): session closed for user root
Feb 23 17:38:38 laiym sshd[1954]: reverse mapping checking getaddrinfo for bogon [192.168.141.1] failed - POSSIBLE BREAK-IN ATTEMPT!
Feb 23 17:38:38 laiym sshd[1954]: Accepted password for root from 192.168.141.1 port 61857 ssh2
Feb 23 17:38:38 laiym sshd[1954]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 23 17:50:19 laiym sshd[2019]: reverse mapping checking getaddrinfo for bogon [192.168.141.1] failed - POSSIBLE BREAK-IN ATTEMPT!
Feb 23 17:50:19 laiym sshd[2019]: Accepted password for root from 192.168.141.1 port 50289 ssh2
Feb 23 17:50:20 laiym sshd[2019]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 24 09:40:51 laiym sshd[1585]: pam_unix(sshd:session): session closed for user root
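The `regex_extractor` interceptor (i1 above) parses each of these lines into the three headers `time`, `hostname`, and `service`. The extraction can be sanity-checked outside Flume; this Python sketch applies the same pattern to one sample line (transcribed from the config, with the doubled backslashes of the .properties file collapsed to single ones):

```python
import re

# Equivalent of the regex_extractor pattern in flume-es.conf; in the
# .properties file each backslash must be doubled, so \\w+ there is \w+ here.
PATTERN = re.compile(r"(\w+\s+\w+\s+\d+:\d+:\d+)\s+(\w+)\s+(\w+)")

line = "Feb 23 17:38:20 laiym sshd[1591]: pam_unix(sshd:session): session closed for user root"

m = PATTERN.match(line)
# The capture groups map to the serializer names s1..s3: time, hostname, service.
headers = dict(zip(["time", "hostname", "service"], m.groups()))
print(headers)
# {'time': 'Feb 23 17:38:20', 'hostname': 'laiym', 'service': 'sshd'}
```

Note that the third group stops at the `[`, so `service` comes out as `sshd` without the pid.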
When starting, enable INFO-level logging to the console to watch the startup status.
[root@laiym ~]# cd /usr/local/flume/
[root@laiym flume]# ./bin/flume-ng agent -c ./conf/ -f ./conf/flume-es.conf -n agent -Dflume.root.logger=INFO,console
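One thing to keep in mind when looking for the data afterwards: the ElasticSearchSink, in Logstash style, appends the current UTC date to `indexName`, so events should land in a daily index such as `linux_secure-2016-02-23` rather than in a plain `linux_secure` index. A sketch of that naming scheme (the function here is illustrative, not Flume's actual code):

```python
from datetime import datetime, timezone

# Sketch of the ElasticSearchSink's Logstash-style daily index naming:
# the configured indexName plus "-" plus the event date in UTC.
def daily_index(index_name: str, when: datetime) -> str:
    return "{}-{}".format(index_name, when.strftime("%Y-%m-%d"))

print(daily_index("linux_secure", datetime(2016, 2, 23, tzinfo=timezone.utc)))
# linux_secure-2016-02-23
```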
Screenshot of the data in ES:
Screenshot of the data in Kibana:
OK, perfect!