Hadoop Study (5) ------ Building from Source


Preparing the build environment:

The Hadoop source is compiled here in an ubuntukylin-14.04-desktop-i386 environment.


1. Install the JDK and Maven

See http://blog.csdn.net/happyanger6/article/details/45205877 for details.
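Before moving on, it can help to confirm the tools actually landed on PATH. A minimal sketch, assuming a POSIX shell (the `need_install` helper is illustrative, not from the linked post):

```shell
# Hypothetical helper: succeeds (exit 0) when the named command is NOT on PATH.
need_install() {
  ! command -v "$1" >/dev/null 2>&1
}

# Report which prerequisites are still missing.
for tool in java mvn; do
  if need_install "$tool"; then
    echo "missing: $tool"
  else
    echo "found: $tool"
  fi
done
```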


2. Build and install Hadoop's dependencies

protobuf 2.5.0 (this exact version is required; the Hadoop 2.6.0 source depends on it)


GitHub address:

https://github.com/google/protobuf/tree/v2.5.0


Download it, then unzip:

unzip protobuf-2.5.0.zip


Enter the source directory: cd protobuf-2.5.0

./autogen.sh (generates the configure script that sets up the build)

./configure --prefix=/usr (configures the build; --prefix=/usr installs directly under /usr)

make (compile)

make install (install)

After installation, run protoc --version to verify; it should print:

libprotoc 2.5.0
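An older protoc elsewhere on PATH can silently shadow the freshly installed one. A minimal sketch for checking the reported version against what Hadoop 2.6.0 expects (`check_protoc_version` is my own naming, not part of protobuf):

```shell
# Hypothetical helper: validate a `protoc --version` string against 2.5.0.
check_protoc_version() {
  want="2.5.0"
  got="${1#libprotoc }"   # strip the "libprotoc " prefix from the version line
  [ "$got" = "$want" ]
}

# Typical use: check_protoc_version "$(protoc --version)"
check_protoc_version "libprotoc 2.5.0" && echo "version OK"   # prints "version OK"
```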

We also need to package protobuf's Java bindings as a jar and install it into the local Maven repository so the Hadoop build can use it:

cd java

mvn install

(mvn install runs the package phase as part of its lifecycle, so a separate mvn package is not needed.)
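If the install succeeded, the jar sits in the local Maven repository under the com.google.protobuf:protobuf-java:2.5.0 coordinates. A sketch for locating it (`artifact_path` is an illustrative helper; ~/.m2/repository is Maven's default local repository):

```shell
# Hypothetical helper: where protobuf-java 2.5.0 should land in a local
# Maven repository (defaults to ~/.m2/repository).
artifact_path() {
  repo="${1:-$HOME/.m2/repository}"
  echo "$repo/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar"
}

# Typical check after `mvn install`:
#   [ -f "$(artifact_path)" ] && echo "protobuf-java is installed"
artifact_path
```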


Note: if ./autogen.sh fails while downloading gtest, or the network is unreachable, you can comment out the gtest-related lines in the script.

You may also need to install autoreconf and other build tools first (apt-get install gcc g++ make maven cmake zlib1g zlib1g-dev libcurl4-openssl-dev automake libtool).
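The package list above can be checked mechanically before running ./autogen.sh. A small sketch (`check_tools` is my own helper, not part of protobuf's build):

```shell
# Hypothetical helper: print which of the named tools are missing from PATH.
check_tools() {
  missing=""
  for t in "$@"; do
    command -v "$t" >/dev/null 2>&1 || missing="$missing $t"
  done
  echo "$missing"
}

# Empty output means the autotools toolchain is ready for ./autogen.sh.
check_tools gcc g++ make autoconf automake libtool
```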


3. Download and build the Hadoop source

http://apache.dataguru.cn/hadoop/common/hadoop-2.6.0/

Unzip it, enter the source directory, and run:

mvn clean package -DskipTests

Maven has to download many of Hadoop's dependency jars during the build, so it takes quite a while. Eventually the build succeeds:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [33.268s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.625s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [0.461s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.289s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.111s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [0.988s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [0.486s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [0.519s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.928s]
[INFO] Apache Hadoop Common .............................. SUCCESS [8.206s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [0.153s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [2.372s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.042s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [16.146s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2.413s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [0.336s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [0.125s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.045s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.039s]
[INFO] hadoop-yarn-api ................................... SUCCESS [3.889s]
[INFO] hadoop-yarn-common ................................ SUCCESS [0.954s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.040s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [0.625s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [0.668s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [0.144s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [0.177s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [0.955s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.382s]
[INFO] hadoop-yarn-client ................................ SUCCESS [0.277s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.039s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.153s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.112s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.038s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [0.256s]
[INFO] hadoop-yarn-project ............................... SUCCESS [0.096s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.062s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [0.510s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1.139s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.238s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [0.261s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [0.427s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [0.294s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.130s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.149s]
[INFO] hadoop-mapreduce .................................. SUCCESS [0.086s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.200s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [0.719s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [0.165s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [0.365s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.174s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [0.139s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [0.097s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [0.213s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [0.033s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [0.205s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [0.135s]
[INFO] Apache Hadoop Client .............................. SUCCESS [0.265s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.118s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [0.251s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [0.278s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.041s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [0.104s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:25.293s
[INFO] Finished at: Sat Apr 25 18:16:26 PDT 2015
[INFO] Final Memory: 86M/247M
[INFO] ------------------------------------------------------------------------
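For reference, Hadoop 2.6.0's BUILDING.txt also documents fuller build variants than the plain jar build shown above; the dist and native-profile builds additionally need cmake, zlib headers, and protoc 2.5.0 on PATH. The commands are held in variables here so the sketch is safe to run anywhere:

```shell
# Build variants documented in Hadoop 2.6.0's BUILDING.txt (not executed here;
# run the one you need from the source tree root, e.g. with: eval "$BUILD_DIST").
BUILD_JARS="mvn clean package -DskipTests"                  # jars only, as above
BUILD_DIST="mvn package -Pdist -DskipTests -Dtar"           # full distribution tarball
BUILD_NATIVE="mvn package -Pdist,native -DskipTests -Dtar"  # plus native libraries

echo "$BUILD_DIST"   # prints: mvn package -Pdist -DskipTests -Dtar
```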



If you hit errors during your own build, feel free to share them so we can dig into them together.


