Connecting Flume to Kafka

Source: Internet · Site: 程序博客网 · Date: 2024/05/09 03:55

Problem

Use Flume to collect logs and Kafka to broadcast the messages.

Flume configuration

[root@SZB-L0032016 bin]# cat ../conf/flume_kafka.conf
a.sinks=k1
a.sources=s1 s2
a.channels=r1
# Sources: tail two local log files
a.sources.s1.type=exec
a.sources.s1.command=tail -F /root/a.log
a.sources.s2.type=exec
a.sources.s2.command=tail -F /root/b.log
# Sink: Kafka topic "test" on broker 10.20.25.199:9092
a.sinks.k1.type=org.apache.flume.sink.kafka.KafkaSink
a.sinks.k1.kafka.topic = test
a.sinks.k1.kafka.bootstrap.servers = 10.20.25.199:9092
a.sinks.k1.kafka.flumeBatchSize = 20
a.sinks.k1.kafka.producer.acks = 1
a.sinks.k1.kafka.producer.linger.ms = 1
a.sinks.k1.kafka.producer.compression.type = snappy
# Channel: file-backed, shared by both sources and the sink
a.channels.r1.type=file
a.channels.r1.checkpointDir=/root/flume/checkpoint
a.channels.r1.dataDirs=/root/flume/data
a.sources.s2.channels=r1
a.sources.s1.channels=r1
a.sinks.k1.channel=r1
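The file channel above persists events to disk under /root/flume, which survives agent restarts. For a quick functional test, a memory channel is a common lighter-weight alternative (a sketch, not part of the original setup; events are lost if the agent dies):

```properties
# Memory channel variant: faster, but not durable.
a.channels.r1.type = memory
a.channels.r1.capacity = 1000
a.channels.r1.transactionCapacity = 100
```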

Start the Flume agent

[root@SZB-L0032016 bin]# ./flume-ng agent --conf conf --conf-file ../conf/flume_kafka.conf --name a -Dflume.root.logger=INFO,console

Write some content to the Flume source

[root@SZB-L0032016 ~]# echo "test" > a.log
[root@SZB-L0032016 ~]# echo "test11111" > a.log
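Note that `>` truncates the file on each write; `tail -F` tolerates truncation and re-reads the file, which is why both events still reach the sink. Appending with `>>` is the more usual way to simulate a growing log. A minimal local sketch of the difference (using /tmp/a.log so as not to touch the real log):

```shell
# '>' truncates before writing, so after the first line the file is reset;
# '>>' appends, so the file grows like a real log.
echo "test"      >  /tmp/a.log
echo "test11111" >> /tmp/a.log
cat /tmp/a.log
# prints:
# test
# test11111
```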

Start a Kafka console consumer

[xulu@SZB-L0032015 bin]$ ./kafka-console-consumer.sh --zookeeper 10.20.25.241:3181 --topic test --from-beginning
this is message
this is aaa
clearrsss
this is the kafka info
1111dddddddd
test big data
test
test11111

The messages sent by Flume can now be seen on the Kafka consumer side.
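The `--zookeeper` flag belongs to Kafka's old consumer; on Kafka 0.10 and later, the console consumer can read directly from the broker listed in the sink configuration. A sketch of the newer form, assuming the same broker address (requires a running cluster, so it cannot be verified here):

```shell
# New-consumer form: connect to the broker rather than ZooKeeper.
./kafka-console-consumer.sh --bootstrap-server 10.20.25.199:9092 \
    --topic test --from-beginning
```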
