Compiling Hadoop 2.7.1 from Source Locally: Notes and Troubleshooting
A small rant first: this build took me three days. I was about to give up, but one last attempt in the evening finally succeeded. Here I share the build process, the problems I ran into, and their solutions, for anyone studying this later.
Build preparation
1. Download the required software
First download the hadoop-2.7.1 source from the official site and extract it, then open BUILDING.txt in the extracted directory; the build steps and required software below follow that document.
Requirements:
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Jansson C XML parsing library (if compiling libwebhdfs)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
This is the software the build needs, including:
- JDK 1.7+
- Maven 3.0 or later
- Findbugs 1.3.9
- ProtocolBuffer 2.5.0
- CMake 2.6
- zlib-devel
- openssl-devel
According to other users' notes, autoconf, automake, gcc, and so on are also needed.
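With this many prerequisites it is easy to miss one. As a quick sanity check before starting, a small sketch like the following (my own helper, not part of Hadoop or BUILDING.txt) reports which tools are already on the PATH and their versions:

```shell
#!/bin/sh
# Report whether each build prerequisite is installed, and its version.
# Tool names follow BUILDING.txt; the version commands are the usual ones.
check() {
  tool=$1; shift
  if command -v "$tool" >/dev/null 2>&1; then
    # Most tools print the version on the first line; java uses stderr.
    echo "FOUND $tool: $("$@" 2>&1 | head -n 1)"
  else
    echo "MISSING $tool"
  fi
}
check java     java -version
check mvn      mvn -version
check protoc   protoc --version
check cmake    cmake --version
check findbugs findbugs -version
check gcc      gcc --version
```

Anything reported as MISSING needs to be installed in step 2 below.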
2. Install the software
1> Install JDK 1.7 and configure its environment variables; this was covered in an earlier post, so I won't repeat it here.
2> Install the various libraries
yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
3> Install Maven
Extract, install, and configure the environment variables; see the earlier post for the exact directory layout.
tar zxvf apache-maven-3.3.9-bin.tar.gz
mv apache-maven-3.3.9 ../labc/
vi /etc/profile
Append the following to the end of the profile file:
export MAVEN_HOME=/home/hadoop/labc/apache-maven-3.3.9
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=$PATH:$MAVEN_HOME/bin
Save, reload the environment with source /etc/profile, then run mvn -version; output like the following means the installation and configuration are correct.
[hadoop@Master ~]$ mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /home/hadoop/labc/apache-maven-3.3.9
Java version: 1.7.0_67, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_67/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-71.el6.i686", arch: "i386", family: "unix"
[hadoop@Master ~]$
4> Install ProtocolBuffer
Extract, build, and install:
tar zxvf protobuf-2.5.0.tar.gz
mv protobuf-2.5.0 ../labc/
cd ../labc/protobuf-2.5.0
./configure
make
make install
Run protoc --version; output like the following means it is installed correctly.
[hadoop@Master ~]$ protoc --version
libprotoc 2.5.0
[hadoop@Master ~]$
5> Install Findbugs
Extract, install, and configure the environment variables:
unzip findbugs-1.3.9.zip
mv findbugs-1.3.9 ../labc/
vi /etc/profile
Append the following to the end of the profile file:
export FINDBUGS_HOME=/home/hadoop/labc/findbugs-1.3.9
export PATH=$PATH:$FINDBUGS_HOME/bin
Save, reload the environment with source /etc/profile, then run findbugs -version; output like the following means it is configured correctly.
[hadoop@Master ~]$ findbugs -version
1.3.9
[hadoop@Master ~]$
Starting the build
First make sure the machine can reach the Internet (getting a VM online was covered in an earlier post) and that the connection stays up throughout the build. Then cd into the extracted hadoop-2.7.1 source directory and run:
mvn package -Pdist,native -DskipTests -Dtar
or this:
mvn package -Pdist,native,docs,src -DskipTests -Dtar
The former builds only the native code, while the latter also builds the documentation and source packages, so the former is faster.
Then comes a long wait; when the following appears, the build has succeeded.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 3.533 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 2.023 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 3.679 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.275 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 2.875 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 4.856 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 4.340 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 4.534 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 5.398 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:02 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 11.653 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 24.501 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.112 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [07:28 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 41.608 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 10.673 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 7.225 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.057 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.110 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [03:36 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 45.418 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.164 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 12.942 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 19.200 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 3.315 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 7.855 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 24.347 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 6.439 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 6.393 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 3.445 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.075 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 2.304 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 2.026 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.155 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 7.255 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 11.871 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.254 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 26.029 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 25.002 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 3.792 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 7.797 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 5.143 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 6.771 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 1.837 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 4.513 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 6.842 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 4.355 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 14.910 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.844 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 6.931 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 3.937 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 2.499 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 2.268 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 2.739 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 5.793 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 4.444 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 4.258 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 47.689 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 19.524 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.305 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 5.581 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 25.708 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.281 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:53 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:49 min
[INFO] Finished at: 2015-12-11T20:29:45+08:00
[INFO] Final Memory: 110M/493M
[INFO] ------------------------------------------------------------------------
The compiled package is at ../hadoop-dist/target/hadoop-2.7.1.tar.gz.
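Since the whole point of the -Pnative profile is the native libraries, it is worth confirming they actually made it into the package. A sketch of such a check (the helper name is mine; `hadoop checknative -a` is the command the distribution itself ships for this):

```shell
#!/bin/sh
# Sketch: verify that a freshly unpacked distribution contains the native libs.
# Pass the root of the unpacked tarball, e.g. /tmp/hadoop-2.7.1.
verify_native() {
  dist=$1
  if [ -e "$dist/lib/native/libhadoop.so" ]; then
    echo "native libs present in $dist"
  else
    echo "native libs missing in $dist"
    return 1
  fi
}
# Example usage after unpacking the built tarball:
#   tar -xzf hadoop-dist/target/hadoop-2.7.1.tar.gz -C /tmp
#   verify_native /tmp/hadoop-2.7.1
#   /tmp/hadoop-2.7.1/bin/hadoop checknative -a   # reports zlib/openssl support
```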
Problems encountered during the build
Error 1
Connection to http://repo.maven.apache.org refused
This means the connection to the remote Maven repository was refused. Just run the build command again and it will resume downloading the jar packages.
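Rather than re-running the command by hand every time the connection drops, a small retry wrapper can do it automatically. This is my own sketch, not part of Maven; the attempt cap makes sure a real failure still surfaces instead of looping forever:

```shell
#!/bin/sh
# Retry a command until it succeeds, up to 5 attempts.
# RETRY_DELAY (seconds between attempts) defaults to 10.
retry() {
  n=0
  until "$@"; do
    n=$((n + 1))
    if [ "$n" -ge 5 ]; then
      echo "giving up after $n failed attempts"
      return 1
    fi
    echo "attempt $n failed, retrying..."
    sleep "${RETRY_DELAY:-10}"
  done
}
# Example usage:
#   retry mvn package -Pdist,native -DskipTests -Dtar
```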
Error 2
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-nfs: Compilation failure: Compilation failure:
[ERROR] /home/hadoop/toolkits/hadoop-2.7.1-src/hadoop-common-project/hadoop-nfs/src/main/java/org/apache/hadoop/oncrpc/XDR.java:[23,30] package org.jboss.netty.buffer does not exist
Few people will probably hit this one. Because repo.maven.apache.org was slow, I had switched to a third-party mirror and thereby misconfigured Maven's settings file; going back to the defaults fixed it, slow as they are.
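If you suspect a mirror misconfiguration like this, one quick way back to the defaults is to move the custom settings file aside; Maven then falls back to its built-in central repository. A sketch (the helper name is mine; per-user settings live in ~/.m2/settings.xml):

```shell
#!/bin/sh
# Move a possibly misconfigured Maven settings file aside so that the
# default central repository (repo.maven.apache.org) is used again.
reset_maven_settings() {
  settings=${1:-$HOME/.m2/settings.xml}
  if [ -f "$settings" ]; then
    mv "$settings" "$settings.bak"
    echo "moved $settings aside; Maven will use its defaults"
  else
    echo "no custom settings found"
  fi
}
```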
Error 3
[ERROR] around Ant part ...<exec dir="/opt/soft/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target" executable="sh" failonerror="true">... @ 10:123 in /opt/soft/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/antrun/build-main.xml
[ERROR] -> [Help 1]
This happens because the apache-tomcat-6.0.41.tar.gz archive is large and was not downloaded completely. Go to .../hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/apache-tomcat-6.0.41.tar.gz, delete it, and let the build download it again.
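Before deleting, `gzip -t` can confirm the archive really is truncated; it verifies a gzip file's integrity without extracting it. A sketch (the helper is my own, not part of the build):

```shell
#!/bin/sh
# Test a .tar.gz for truncation/corruption; delete it if the check fails,
# so the next build run fetches a fresh copy.
check_tarball() {
  tgz=$1
  if gzip -t "$tgz" 2>/dev/null; then
    echo "ok"
  else
    echo "corrupt or truncated, removing $tgz"
    rm -f "$tgz"
  fi
}
# Example usage against the cached Tomcat archive from the error above:
#   check_tarball hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/apache-tomcat-6.0.41.tar.gz
```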
Tips:
1. Sometimes a package download takes far too long during the build; the connection has effectively hung. Press Ctrl+C and re-run the build command.
2. If some file turns out to be missing, clean Maven first (with mvn clean) and then rebuild.
====================================
A final word:
After a long ordeal it finally succeeded; maybe I'm just not very good at this yet. If your build fails, feel free to leave a comment and discuss.