The Complete Flume Guide: kafka source - kafka channel - hdfs
Source: Internet · Site: 程序博客网 · Date: 2024/06/08 13:39
agent.sources = kafkaSource1
agent.channels = kafkaChannel
agent.sinks = hdfsSink
agent.sources.kafkaSource1.channels = kafkaChannel
agent.sinks.hdfsSink.channel = kafkaChannel
agent.sources.kafkaSource1.type = org.apache.flume.source.kafka.KafkaSource
agent.sources.kafkaSource1.kafka.bootstrap.servers = node7:9092
# kafka.topics replaces the deprecated ZooKeeper-based topic/zookeeperConnect settings
agent.sources.kafkaSource1.kafka.topics = bpu_sensor_router,bpu_sensor_record_present,bpu_group_status_present,bpu_gateway_heartbeat,bpu_gateway_router,bpu_sensor_heartbeat
agent.sources.kafkaSource1.kafka.consumer.group.id = flume
agent.sources.kafkaSource1.kafka.consumer.timeout.ms = 100
agent.sources.kafkaSource1.batchSize = 100
agent.sources.kafkaSource1.batchDurationMillis = 1000
agent.channels.kafkaChannel.type = org.apache.flume.channel.kafka.KafkaChannel
agent.channels.kafkaChannel.kafka.bootstrap.servers = node7:9092
agent.channels.kafkaChannel.kafka.topic = flume-kafkaChannel
agent.channels.kafkaChannel.kafka.consumer.group.id = flume-consumer
#--------- hdfsSink configuration ------------------
agent.sinks.hdfsSink.type = hdfs
# Note: events are partitioned by Kafka topic and by date under the warehouse directory
agent.sinks.hdfsSink.hdfs.path = hdfs://nameservice1/user/hive/warehouse/%{topic}/%Y/%m/%d
agent.sinks.hdfsSink.hdfs.writeFormat = Text
agent.sinks.hdfsSink.hdfs.fileType = DataStream
agent.sinks.hdfsSink.hdfs.rollSize = 128000000
agent.sinks.hdfsSink.hdfs.rollInterval=60
agent.sinks.hdfsSink.hdfs.rollCount = 0
agent.sinks.hdfsSink.hdfs.batchSize = 100
agent.sinks.hdfsSink.hdfs.round = true
# roundUnit accepts second/minute/hour only; hour granularity is sufficient for a daily path
agent.sinks.hdfsSink.hdfs.roundUnit = hour
agent.sinks.hdfsSink.hdfs.roundValue = 1
agent.sinks.hdfsSink.hdfs.threadsPoolSize = 25
agent.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
agent.sinks.hdfsSink.hdfs.minBlockReplicas = 1
agent.sinks.hdfsSink.hdfs.idleTimeout = 30
agent.sinks.hdfsSink.hdfs.filePrefix = %{topic}
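The `hdfs.path` above mixes a header escape (`%{topic}`, filled from the topic header that the Kafka source attaches to each event) with time escapes (`%Y/%m/%d`, resolved from the event timestamp since `useLocalTimeStamp = true`). A minimal Python sketch, not Flume's actual code, of how such a pattern expands:

```python
from datetime import datetime

def expand_flume_path(pattern, headers, when):
    """Illustration only: mimic Flume's %{header} substitution and
    strftime-style time escapes to show where an event would land."""
    out = pattern
    for key, value in headers.items():
        out = out.replace("%%{%s}" % key, value)
    # for %Y/%m/%d, Flume's escapes behave like strftime codes
    return when.strftime(out)

path = "hdfs://nameservice1/user/hive/warehouse/%{topic}/%Y/%m/%d"
when = datetime(2024, 6, 8, 13, 39)
print(expand_flume_path(path, {"topic": "bpu_sensor_router"}, when))
# hdfs://nameservice1/user/hive/warehouse/bpu_sensor_router/2024/06/08
```

Each topic therefore gets its own directory tree, which matches the per-topic Hive warehouse layout implied by the path.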
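A common reason an agent fails to start is a wiring mistake: a source's `channels` list or a sink's `channel` pointing at a channel that was never declared. A small sketch (not part of Flume) that parses a properties file like the one above and checks that wiring:

```python
# Sketch only: parse Flume-style properties and verify source/sink -> channel wiring.
def check_flume_wiring(text, agent="agent"):
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()

    channels = set(props.get(agent + ".channels", "").split())
    problems = []
    for src in props.get(agent + ".sources", "").split():
        used = set(props.get(agent + ".sources." + src + ".channels", "").split())
        for ch in used - channels:
            problems.append("source %s -> undeclared channel %s" % (src, ch))
    for sink in props.get(agent + ".sinks", "").split():
        ch = props.get(agent + ".sinks." + sink + ".channel", "")
        if ch not in channels:
            problems.append("sink %s -> undeclared channel %s" % (sink, ch))
    return problems

conf = """
agent.sources = kafkaSource1
agent.channels = kafkaChannel
agent.sinks = hdfsSink
agent.sources.kafkaSource1.channels = kafkaChannel
agent.sinks.hdfsSink.channel = kafkaChannel
"""
print(check_flume_wiring(conf))  # [] -- wiring is consistent
```

Running this against the full config above should return an empty list; an empty result does not validate individual property names, only the source/channel/sink wiring.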