Flume: reading from TCP and writing to HDFS
# Please paste flume.conf here. Example:
# Sources, channels, and sinks are defined per
# agent name, in this case 'tier1'.
tier1.sources = source1
tier1.channels = channel1
tier1.sinks = sink1

# For each source, channel, and sink, set
# standard properties.
tier1.sources.source1.type = syslogtcp
tier1.sources.source1.bind = 127.0.0.1
tier1.sources.source1.port = 9999
tier1.sources.source1.channels = channel1

tier1.channels.channel1.type = memory

tier1.sinks.sink1.channel = channel1
tier1.sinks.sink1.type = hdfs
tier1.sinks.sink1.hdfs.path = hdfs://ha0:8020/flume/events
tier1.sinks.sink1.hdfs.fileType = DataStream
tier1.sinks.sink1.hdfs.writeFormat = Text
tier1.sinks.sink1.hdfs.rollInterval = 0
tier1.sinks.sink1.hdfs.rollSize = 1024
tier1.sinks.sink1.hdfs.rollCount = 0
tier1.sinks.sink1.hdfs.idleTimeout = 60

# Other properties are specific to each type of
# source, channel, or sink. In this case, we
# specify the capacity of the memory channel.
tier1.channels.channel1.capacity = 100
Test:

With the agent running, connect to the syslogtcp source and type lines to send events (the source expects syslog-formatted messages):

nc 127.0.0.1 9999

To check that nothing else is holding the port before starting the agent, a plain listener can be run first:

nc -l 9999
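The nc test above can also be scripted. The following is a minimal sketch, assuming the agent from the config above is running and listening on 127.0.0.1:9999; the `syslog_line` and `send_event` helper names, and the facility/severity values, are illustrative, not part of Flume.

```python
import socket

def syslog_line(message, facility=1, severity=5, tag="test"):
    """Build a minimal RFC 3164-style syslog line.

    The syslogtcp source parses the leading <PRI> field;
    PRI = facility * 8 + severity.
    """
    pri = facility * 8 + severity
    return "<%d>%s: %s\n" % (pri, tag, message)

def send_event(message, host="127.0.0.1", port=9999):
    # Open a TCP connection to the Flume syslogtcp source
    # and send a single newline-terminated event.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(syslog_line(message).encode("utf-8"))

if __name__ == "__main__":
    send_event("hello flume")
```

Each line sent this way becomes one Flume event, flows through the memory channel, and is written under hdfs://ha0:8020/flume/events; with rollSize=1024 and idleTimeout=60, files roll at roughly 1 KB or after 60 seconds of inactivity.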