Hadoop Pipes Programming Example
Hadoop Pipes is Hadoop's official C++ interface to the Map/Reduce framework; it communicates with the framework over a socket. The underlying mechanism is not covered in detail here; instead, the word-count example below illustrates how it is used.
1. Code
#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"
#include "hadoop/StringUtils.hh"

const std::string WORDCOUNT = "WORDCOUNT";
const std::string INPUT_WORDS = "INPUT_WORDS";
const std::string OUTPUT_WORDS = "OUTPUT_WORDS";

class WordCountMap: public HadoopPipes::Mapper {
public:
  HadoopPipes::TaskContext::Counter* inputWords;
  WordCountMap(HadoopPipes::TaskContext& context) {
    inputWords = context.getCounter(WORDCOUNT, INPUT_WORDS);
  }
  // Emit <word, "1"> for every whitespace-separated token in the input line.
  void map(HadoopPipes::MapContext& context) {
    std::vector<std::string> words =
      HadoopUtils::splitString(context.getInputValue(), " ");
    for (unsigned int i = 0; i < words.size(); ++i) {
      context.emit(words[i], "1");
    }
    context.incrementCounter(inputWords, words.size());
  }
};

class WordCountReduce: public HadoopPipes::Reducer {
public:
  HadoopPipes::TaskContext::Counter* outputWords;
  WordCountReduce(HadoopPipes::TaskContext& context) {
    outputWords = context.getCounter(WORDCOUNT, OUTPUT_WORDS);
  }
  // Sum the per-word counts and emit <word, total>.
  void reduce(HadoopPipes::ReduceContext& context) {
    int sum = 0;
    while (context.nextValue()) {
      sum += HadoopUtils::toInt(context.getInputValue());
    }
    context.emit(context.getInputKey(), HadoopUtils::toString(sum));
    context.incrementCounter(outputWords, 1);
  }
};

int main(int argc, char *argv[]) {
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<WordCountMap, WordCountReduce>());
}
2. Compiling
The makefile is as follows:
CC = g++
HADOOP_INSTALL = /home/keke/hadoop-0.20.2-cdh3u4
PLATFORM = Linux-i386-32
CPPFLAGS = -m32 -I$(HADOOP_INSTALL)/c++/$(PLATFORM)/include

wordcount: wordcount.cpp
	$(CC) $(CPPFLAGS) $< -Wall -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils -lpthread -g -O2 -o $@
3. Running
First copy the executable onto HDFS, for example under the bin directory on HDFS.
Then run:
hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input /user/keke/input -output output -program bin/wordcount