Integrating Kafka with Flume
Source: Internet | Editor: 程序博客网 | Date: 2024/05/18 19:36
Kafka and Flume
1) Prepare the JAR packages
1. Copy the following JARs from Kafka's lib directory into Flume's lib directory:
kafka_2.10-0.8.2.1.jar, kafka-clients-0.8.2.1.jar, jopt-simple-3.2.jar, metrics-core-2.2.0.jar, scala-library-2.10.4.jar, zkclient-0.3.jar, etc.
2. Download the Flume-Kafka plugin package, flumeng-kafka-plugin.jar, and copy it into the Flume home directory; the JARs from step 1 are its dependencies.
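Steps 1 and 2 can be sketched as a short shell script. Here `KAFKA_HOME` and `FLUME_HOME` are scratch directories standing in for the real installations (Kafka 0.8.x ships its JARs under `libs/`); substitute your actual paths, and note the `touch` lines only create stand-in files for illustration.

```shell
# Stand-in install roots; replace with your real Kafka/Flume homes.
KAFKA_HOME=$(mktemp -d)
FLUME_HOME=$(mktemp -d)
mkdir -p "$KAFKA_HOME/libs" "$FLUME_HOME/lib"

JARS="kafka_2.10-0.8.2.1.jar kafka-clients-0.8.2.1.jar jopt-simple-3.2.jar
metrics-core-2.2.0.jar scala-library-2.10.4.jar zkclient-0.3.jar"
for j in $JARS; do touch "$KAFKA_HOME/libs/$j"; done   # stand-ins for the real JARs

# Step 1: the dependency JARs go into Flume's lib directory
for j in $JARS; do cp "$KAFKA_HOME/libs/$j" "$FLUME_HOME/lib/"; done

# Step 2: the plugin JAR goes into the Flume home directory itself
touch "$KAFKA_HOME/flumeng-kafka-plugin.jar"           # stand-in for the download
cp "$KAFKA_HOME/flumeng-kafka-plugin.jar" "$FLUME_HOME/"

ls "$FLUME_HOME/lib" | wc -l   # 6 dependency JARs copied
```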
2) Configure the flume.conf file as follows:
#agent section
producer.sources = s
producer.channels = c
producer.sinks = r
#source section
#producer.sources.s.type = seq
producer.sources.s.channels = c
producer.sources.s.type = exec
producer.sources.s.command = tail -fn 1 /letv/logs/test.log
# Each sink's type must be defined
producer.sinks.r.type = org.apache.flume.plugins.KafkaSink
producer.sinks.r.metadata.broker.list=10.148.13.10:9092,10.148.13.11:9092,10.148.13.12:9092,10.148.13.13:9092,10.148.13.14:9092,10.148.13.15:9092,10.148.13.16:9092,10.148.13.17:9092,10.148.13.18:9092,10.148.13.19:9092
#producer.sinks.r.partition.key=0
#producer.sinks.r.partitioner.class=org.apache.flume.plugins.SinglePartition
producer.sinks.r.serializer.class=kafka.serializer.StringEncoder
producer.sinks.r.request.required.acks=-1
producer.sinks.r.max.message.size=1000000
producer.sinks.r.producer.type=sync
producer.sinks.r.custom.encoding=UTF-8
producer.sinks.r.custom.topic.name=test-topic
#Specify the channel the sink should use
producer.sinks.r.channel = c
# Each channel's type is defined.
producer.channels.c.type = memory
producer.channels.c.capacity = 1000000
producer.channels.c.transactionCapacity = 1000000
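One easy mistake when launching the agent is passing `flume-ng -n` a name that differs from the property prefix: every key in the file above starts with `producer.`, so the agent must be started with the name `producer`. A quick sanity check extracts the prefix from the config (a scratch path and a shortened copy of the config are used here for illustration):

```shell
# Write the agent section to a scratch file; in practice point the
# grep at your real flume-kafka.conf.
cat > /tmp/flume-kafka.conf <<'EOF'
producer.sources = s
producer.channels = c
producer.sinks = r
EOF

# Every property key must share the agent name given to `flume-ng -n`
grep -o '^[a-z]*\.' /tmp/flume-kafka.conf | sort -u   # prints: producer.
```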
Testing:
- Start the ZooKeeper service (the coordination component Kafka depends on)
- Start the Kafka service and create a topic named test-topic:
./kafka-topics.sh --create --zookeeper master-active:2181 --replication-factor 1 --partitions 1 --topic test-topic
- Start the Flume agent (the agent name passed to -n must match the property prefix in the config, producer):
./flume-ng agent -n producer -f ../conf/flume-kafka.conf -Dflume.root.logger=INFO,console
- Append a test line to the log file:
echo "hello world , kafka and flume !" >> /letv/logs/test.log
- Start a Kafka console consumer to watch the incoming stream:
./kafka-console-consumer.sh --zookeeper master-active:2181 --topic test-topic --from-beginning
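The exec source in the config simply runs `tail` against the log file, so you can confirm locally that appended lines are what the source would hand to Kafka (a scratch file stands in for /letv/logs/test.log; the real source uses `-f` to follow the file continuously):

```shell
# Scratch file in place of /letv/logs/test.log
LOG=$(mktemp)
echo "hello world , kafka and flume !" >> "$LOG"
tail -n 1 "$LOG"   # prints: hello world , kafka and flume !
```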