Flume: Local Files to Kafka
Flume downloads
Official user guide: https://flume.apache.org/FlumeUserGuide.html
apache-flume-1.6.0-bin.tar.gz
http://pan.baidu.com/s/1o81nR8e (extraction code: s832)
apache-flume-1.5.2-bin.tar.gz
http://pan.baidu.com/s/1bp6tXVL (extraction code: 4n4z)
Official download page: https://flume.apache.org/download.html
Create the configuration file
cd /usr/app/flume1.6/conf
vi flume-dirKakfa.properties
#agent1 name
agent1.sources=source1
agent1.sinks=sink1
agent1.channels=channel1

#Spooling Directory
#set source1
agent1.sources.source1.type=spooldir
agent1.sources.source1.spoolDir=/usr/app/flumelog/dir/logdfs
agent1.sources.source1.channels=channel1
agent1.sources.source1.fileHeader=false
agent1.sources.source1.interceptors=i1
agent1.sources.source1.interceptors.i1.type=timestamp

#set sink1
agent1.sinks.sink1.type=org.apache.flume.sink.kafka.KafkaSink
agent1.sinks.sink1.topic=Flumelog
agent1.sinks.sink1.brokerList=hadoop11:9092,hadoop12:9092
agent1.sinks.sink1.requiredAcks=1
agent1.sinks.sink1.batchSize=100
agent1.sinks.sink1.channel=channel1

#set channel1
agent1.channels.channel1.type=file
agent1.channels.channel1.checkpointDir=/usr/app/flumelog/dir/logdfstmp/point
agent1.channels.channel1.dataDirs=/usr/app/flumelog/dir/logdfstmp
Create the Linux directories
[root@hadoop11 app]# mkdir /usr/app/flumelog/dir
[root@hadoop11 app]# mkdir /usr/app/flumelog/dir/logdfs
[root@hadoop11 app]# mkdir /usr/app/flumelog/dir/logdfstmp
[root@hadoop11 app]# mkdir /usr/app/flumelog/dir/logdfstmp/point
Create the Kafka topic
bin/kafka-topics.sh --create --topic Flumelog --replication-factor 1 --partitions 2 --zookeeper hadoop11:2181
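To confirm the topic exists with the expected partition and replica counts, it can be described before starting the agent (a sketch, assuming the same Kafka bin directory and ZooKeeper address used in the create command):

```shell
# Describe the topic to verify its replication factor and partitions
# (assumes the same Kafka installation and ZooKeeper address as above)
bin/kafka-topics.sh --describe --topic Flumelog --zookeeper hadoop11:2181
```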
Start the Flume agent with this configuration
flume-ng agent -n agent1 -c conf -f ./flume-dirKakfa.properties -Dflume.root.logger=DEBUG,console >./flume1.log 2>&1 &
Check that the agent is running
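The original post showed a screenshot at this point; as a sketch, the agent can be checked from the shell (the log file path is the one used in the start command above):

```shell
# Confirm the Flume agent JVM is running
# (the bracket trick keeps grep from matching its own process)
ps -ef | grep '[f]lume'

# Review the agent's recent debug output for spooldir/Kafka activity
tail -n 50 ./flume1.log
```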
Create test data
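A quick way to generate test data is to drop a file into the spooling directory from the configuration above (the timestamped file name here is just an illustration, not from the original post):

```shell
# Write a test log file into the directory watched by the spooldir source
SPOOL_DIR=${SPOOL_DIR:-/usr/app/flumelog/dir/logdfs}
mkdir -p "$SPOOL_DIR"
FILE="$SPOOL_DIR/test_$(date +%s).log"
printf 'hello flume %s\n' "$(date)" > "$FILE"
# Once Flume has ingested the file, it is renamed with a .COMPLETED suffix
ls -l "$SPOOL_DIR"
```

After the sink delivers the events, the same lines should appear in the Kafka consumers.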
Start Kafka consumers on hadoop11 and hadoop12
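The consumers shown in the original screenshots can be started with the console consumer shipped with Kafka (a sketch; the 0.8/0.9-era console consumer that matches Flume 1.6's Kafka sink reads via ZooKeeper):

```shell
# Run on hadoop11 and on hadoop12; each reads the Flumelog topic from the beginning
bin/kafka-console-consumer.sh --zookeeper hadoop11:2181 --topic Flumelog --from-beginning
```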
Notes
Monitored directory: any log file dropped here is picked up by Flume
[root@hadoop11 app]# mkdir /usr/app/flumelog/dir/logdfs
File-channel data directory: ingested events are stored here until consumed (at most two numbered log files; by default, once a log reaches 1.6 GB the older one is deleted and a new one is created)
[root@hadoop11 app]# mkdir /usr/app/flumelog/dir/logdfstmp
Checkpoint directory recording the state of files in the monitored path
[root@hadoop11 app]# mkdir /usr/app/flumelog/dir/logdfstmp/point