Storm pipeline: connecting Flume and Kafka


For background on connecting Flume and Kafka, see the companion post on integrating Flume, Kafka, Storm, and MySQL.
The related resources (the flume2kafka jar packages and configuration files) are available there as well.
To connect Flume and Kafka, create a .conf file under flume/conf and add the related jar packages under the lib directory.
Steps:
1. Create the .conf file under flume/conf
(1) Create the flume2kafka.conf file

vi flume2kafka.conf

(2) Add the following content to flume2kafka.conf

############################################# producer config ############################################

# agent section
producer.sources = s
producer.channels = c
producer.sinks = r

# source section
# set the source type
producer.sources.s.type = exec
# command used to read the data (tail -f -n+1 /home/storm/work/access.log)
producer.sources.s.command = tail -f -n+1 /home/storm/work/access.log
producer.sources.s.channels = c

# Each sink's type must be defined
producer.sinks.r.type = org.apache.flume.plugins.KafkaSink
producer.sinks.r.metadata.broker.list=master:9092
producer.sinks.r.partition.key=0
producer.sinks.r.partitioner.class=org.apache.flume.plugins.SinglePartition
producer.sinks.r.serializer.class=kafka.serializer.StringEncoder
producer.sinks.r.request.required.acks=0
producer.sinks.r.max.message.size=1000000
producer.sinks.r.producer.type=sync
producer.sinks.r.custom.encoding=UTF-8
# set the Kafka topic to: flume2kafka
producer.sinks.r.custom.topic.name=flume2kafka

# Specify the channel the sink should use
producer.sinks.r.channel = c

# Each channel's type is defined.
producer.channels.c.type = memory
producer.channels.c.capacity = 1000

2. Add the related jar packages to the lib directory

kafka_2.9.2-0.8.0-beta1.jar
metrics-annotation-2.2.0.jar
scala-compiler-2.9.2.jar
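These jars come from the resource bundle linked above. A minimal copy sketch, assuming the jars were downloaded to the current directory and that $FLUME_HOME (a hypothetical variable; adjust for your install) points at the Flume install:

cp kafka_2.9.2-0.8.0-beta1.jar metrics-annotation-2.2.0.jar scala-compiler-2.9.2.jar $FLUME_HOME/lib/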

3. Start the Flume agent

bin/flume-ng agent -n producer -f conf/flume2kafka.conf -Dflume.root.logger=INFO,console >> logs/flume.log 2>&1 &
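The agent name passed with -n (producer) must match the prefix used in the config file. To confirm the agent is still running and see what it logged, two quick checks (a sketch, using the log path from the redirect above):

tail -n 50 logs/flume.log

ps -ef | grep flume-ng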

4. Start Kafka and a Kafka consumer (to check whether data is coming through)
(1) Start Kafka

sbin/start-kafka.sh
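With the broker up at master:9092 (as configured in the sink above), the flume2kafka topic can be created explicitly rather than relying on auto-creation. A sketch, assuming Kafka 0.8.1 or later; on the 0.8.0-beta release matching the jar above, the script is bin/kafka-create-topic.sh with slightly different flags:

bin/kafka-topics.sh --create --zookeeper master:2181 --replication-factor 1 --partitions 1 --topic flume2kafka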

(2) Start the consumer

bin/kafka-console-consumer.sh --zookeeper master:2181 --topic flume2kafka --from-beginning
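If everything is wired correctly, lines appended to the tailed file should show up in the consumer output within a second or two. A quick way to push a test event through, using the log path from the config above:

echo "flume2kafka test $(date)" >> /home/storm/work/access.log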