Flume: Writing Network Data into HDFS
1. Create the agent configuration file
Save the following content as agent4.conf in Flume's working directory, /opt/flume/bin:
agent4.sources = netsource
agent4.sinks = hdfssink
agent4.channels = memorychannel
agent4.sources.netsource.type = netcat
agent4.sources.netsource.bind = localhost
agent4.sources.netsource.port = 3000
agent4.sinks.hdfssink.type = hdfs
agent4.sinks.hdfssink.hdfs.path = /flume
agent4.sinks.hdfssink.hdfs.filePrefix = log
agent4.sinks.hdfssink.hdfs.rollInterval = 0
agent4.sinks.hdfssink.hdfs.rollCount = 3
agent4.sinks.hdfssink.hdfs.fileType = DataStream
agent4.channels.memorychannel.type = memory
agent4.channels.memorychannel.capacity = 1000
agent4.channels.memorychannel.transactionCapacity = 100
agent4.sources.netsource.channels = memorychannel
agent4.sinks.hdfssink.channel = memorychannel
Configuration notes: this agent uses a netcat source and an HDFS sink. Sink files are written to the /flume directory in HDFS, each file name is prefixed with log, and each file holds at most three events (rollCount = 3).
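As a side note, the three file sizes that show up later in the `hadoop fs -ls /flume/` output (20, 17 and 6 bytes) follow directly from rollCount = 3. A rough sketch of the arithmetic, under the assumption that each telnet line keeps its trailing `\r` in the event body and that the DataStream sink appends a `\n` per event:

```python
# Sketch (the "\r" framing is an assumption, not documented Flume behavior):
# telnet sends "\r\n" line endings; the netcat source strips the "\n" but
# the "\r" stays in the event body, and the DataStream sink appends "\n".
events = ["write", "data", "based", "on", "network", "to", "HDFS"]
roll_count = 3  # agent4.sinks.hdfssink.hdfs.rollCount

# rollCount splits the event stream into files of at most 3 events each.
files = [events[i:i + roll_count] for i in range(0, len(events), roll_count)]

# Each stored line is the event body ("\r" included) plus "\n": len + 2.
sizes = [sum(len(e) + 2 for e in f) for f in files]
print(sizes)  # → [20, 17, 6], matching the hadoop fs -ls output below
```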
2. Start the Flume agent
caiyong@caiyong:/opt/flume/bin$ flume-ng agent --conf conf --conf-file agent4.conf --name agent4
3. In another window, open a remote connection and send a few events
caiyong@caiyong:~$ telnet localhost 3000
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
write
OK
data
OK
based
OK
on
OK
network
OK
to
OK
HDFS
OK
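Telnet is convenient interactively, but the same events can be sent from a script. A minimal sketch of a client for the netcat source (the host and port come from agent4.conf; the function name and everything else here are illustrative assumptions):

```python
import socket

def send_events(host, port, events):
    """Send one event per line to a Flume netcat source, collecting the
    "OK" acknowledgement the source returns after each event."""
    replies = []
    with socket.create_connection((host, port)) as sock:
        for event in events:
            sock.sendall((event + "\n").encode("utf-8"))
            replies.append(sock.recv(16).decode("utf-8").strip())
    return replies

# Usage (assumes the agent from step 2 is listening on localhost:3000):
# send_events("localhost", 3000,
#             ["write", "data", "based", "on", "network", "to", "HDFS"])
```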
4. Check the results
caiyong@caiyong:/opt/hadoop$ bin/hadoop fs -ls /
Found 7 items
drwxr-xr-x - caiyong supergroup 0 2015-03-14 14:46 /flume
drwxr-xr-x - caiyong supergroup 0 2015-03-05 14:51 /hbase
drwxr-xr-x - caiyong supergroup 0 2015-03-14 13:07 /home
drwxr-xr-x - caiyong supergroup 0 2015-03-07 16:03 /pig
drwxr-xr-x - caiyong supergroup 0 2015-03-11 19:12 /testcopy
drwxr-xr-x - caiyong supergroup 0 2015-03-14 08:39 /tmp
drwxr-xr-x - caiyong supergroup 0 2015-03-11 19:04 /user
caiyong@caiyong:/opt/hadoop$ bin/hadoop fs -ls /flume/
Found 3 items
-rw-r--r-- 1 caiyong supergroup 20 2015-03-14 14:45 /flume/log.1426315528974
-rw-r--r-- 1 caiyong supergroup 17 2015-03-14 14:45 /flume/log.1426315528975
-rw-r--r-- 1 caiyong supergroup 6 2015-03-14 14:46 /flume/log.1426315528976
caiyong@caiyong:/opt/hadoop$ bin/hadoop fs -cat /flume/*
write
data
based
on
network
to
HDFS
caiyong@caiyong:/opt/hadoop$ bin/hadoop fs -cat /flume/log*4
write
data
based
caiyong@caiyong:/opt/hadoop$ bin/hadoop fs -cat /flume/log*5
on
network
to
caiyong@caiyong:/opt/hadoop$ bin/hadoop fs -cat /flume/log*6
HDFS