Flume and Kafka Integration


1. Create a file named flume-kafka-tail-conf.properties in the Flume directory

# The configuration file needs to define the sources,
# the channels and the sinks.
# Sources, channels and sinks are defined per agent,
# in this case called 'a2'
a2.sources = r2
a2.channels = c2
a2.sinks = k2

### define sources
a2.sources.r2.type = exec
a2.sources.r2.command = tail -F /opt/datas/spark_word_count.log
a2.sources.r2.shell = /bin/bash -c

### define channels
a2.channels.c2.type = memory
a2.channels.c2.capacity = 1000
a2.channels.c2.transactionCapacity = 100

### define sinks
a2.sinks.k2.type = org.apache.flume.sink.kafka.KafkaSink
a2.sinks.k2.brokerList = bigdata.eclipse.com:9092
a2.sinks.k2.topic = kafka_topic

### bind the sources and sinks to the channels
a2.sources.r2.channels = c2
a2.sinks.k2.channel = c2
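With the properties file in place, the agent can be launched with `flume-ng`. A minimal sketch, assuming the file was saved under the Flume installation's `conf/` directory (the exact path is an assumption, adjust it to where you created the file):

```shell
# Start agent 'a2' defined in the properties file above.
# --conf        : directory holding flume-env.sh and log4j config (assumed conf/)
# --conf-file   : the agent definition created in step 1 (path is an assumption)
# --name        : must match the agent prefix used in the file (a2)
bin/flume-ng agent \
  --conf conf \
  --conf-file conf/flume-kafka-tail-conf.properties \
  --name a2 \
  -Dflume.root.logger=INFO,console
```

Note that `--name` must match the `a2.` prefix in the properties file; a mismatched name makes Flume start an agent with no sources, channels, or sinks.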

2. Testing

bin/kafka-topics.sh --create --zookeeper bigdata.eclipse.com:2181 --replication-factor 1 --partitions 1 --topic kafka_topic
bin/kafka-topics.sh --list --zookeeper bigdata.eclipse.com:2181
bin/kafka-console-producer.sh --broker-list bigdata.eclipse.com:9092 --topic kafka_topic
bin/kafka-console-consumer.sh --zookeeper bigdata.eclipse.com:2181 --topic kafka_topic --from-beginning
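Besides typing into the console producer, the pipeline can also be exercised from the input side: the exec source tails `/opt/datas/spark_word_count.log`, so any line appended to that file should flow through the channel, into the Kafka sink, and show up in the console consumer. A minimal sketch of a test-data generator (the helper name `append_events` is illustrative, not part of Flume or Kafka):

```python
# Append test events to the file tailed by the Flume exec source
# (path taken from a2.sources.r2.command in the config above).
LOG_PATH = "/opt/datas/spark_word_count.log"

def append_events(path, events):
    """Append one event per line, flushing after each write so that
    `tail -F` picks the line up promptly rather than on buffer flush."""
    with open(path, "a") as f:
        for event in events:
            f.write(event + "\n")
            f.flush()

if __name__ == "__main__":
    append_events(LOG_PATH, ["hello flume", "hello kafka"])
```

After running this, the two lines should appear in the `kafka-console-consumer.sh` output if the agent is running.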