Compiling Hadoop 2.2.0 on CentOS 6.4
Source: http://www.cnblogs.com/toughhou/p/3864273.html
The system image used to set up this Hadoop environment was emi-centos-6.4-x86_64, which is 64-bit, while the Hadoop binary release ships with 32-bit native libraries by default. As a result, many operations fail to load the Hadoop native library:
(Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.)
14/07/29 02:36:34 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
14/07/29 02:36:34 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: /usr/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: wrong ELF class: ELFCLASS32 (Possible cause: architecture word width mismatch)
14/07/29 02:36:34 DEBUG util.NativeCodeLoader: java.library.path=/usr/hadoop/hadoop-2.2.0/lib/native
14/07/29 02:36:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
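Before rebuilding, you can confirm that the bundled library really is 32-bit. `file lib/native/libhadoop.so.1.0.0` reports the ELF class directly; the sketch below instead reads byte 5 of the ELF header (1 = ELFCLASS32, 2 = ELFCLASS64) and is demonstrated against fabricated headers, since the real library path varies by install. The `check_elf_class` helper is a name invented for this example.

```shell
# Sketch: report whether a native library is 32- or 64-bit by reading byte 5
# of its ELF header (1 = ELFCLASS32, 2 = ELFCLASS64). check_elf_class is a
# helper name invented for this example.
check_elf_class() {
  cls=$(od -An -j4 -N1 -tu1 "$1" | tr -d ' ')
  if [ "$cls" = "2" ]; then echo "64-bit"; else echo "32-bit"; fi
}

# Demonstrate against two fabricated ELF headers; the real check would target
# e.g. /usr/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
printf '\177ELF\001' > /tmp/fake-elf32.so
printf '\177ELF\002' > /tmp/fake-elf64.so
check_elf_class /tmp/fake-elf32.so   # prints "32-bit"
check_elf_class /tmp/fake-elf64.so   # prints "64-bit"
```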
To fix this, Hadoop has to be recompiled from source, and the resulting hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native directory copied over /usr/hadoop-2.2.0/lib/native.
The detailed build steps follow.
[Note: the commands below are best run as root, or with sudo.]
1. Install the following packages
[root@hd1 software]# yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel
2. Install Maven
[hxiaolong@hd1 software]$ wget http://www.us.apache.org/dist/maven/maven-3/3.2.2/binaries/apache-maven-3.2.2-bin.tar.gz
[hxiaolong@hd1 software]$ tar zxf apache-maven-3.2.2-bin.tar.gz -C /usr
[hxiaolong@hd1 software]$ vi ~/.bash_profile
export MAVEN_HOME=/usr/apache-maven-3.2.2
export PATH=$PATH:$MAVEN_HOME/bin
3. Install Ant
[hxiaolong@hd1 software]$ wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
[hxiaolong@hd1 software]$ tar zxf apache-ant-1.9.4-bin.tar.gz -C /usr
[hxiaolong@hd1 software]$ vi ~/.bash_profile
# Ant
export ANT_HOME=/usr/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
4. Install Findbugs
[hxiaolong@hd1 software]$ wget http://prdownloads.sourceforge.net/findbugs/findbugs-2.0.3.tar.gz?download
[hxiaolong@hd1 software]$ tar zxf findbugs-2.0.3.tar.gz -C /usr
[hxiaolong@hd1 software]$ vi ~/.bash_profile
# Findbugs
export FINDBUGS_HOME=/usr/findbugs-2.0.3
export PATH=$PATH:$FINDBUGS_HOME/bin
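The exports added to ~/.bash_profile only take effect in a new login shell; after `source ~/.bash_profile` it is worth confirming that each tool actually resolves before starting the long build. A small sketch (the `on_path` helper is a name made up here; `sh` is used as an always-present stand-in):

```shell
# Sketch: verify that required build tools are reachable on PATH after
# sourcing ~/.bash_profile. on_path is a helper invented for this example.
on_path() { command -v "$1" >/dev/null 2>&1 && echo yes || echo no; }

# source ~/.bash_profile   # pick up MAVEN_HOME, ANT_HOME, FINDBUGS_HOME
on_path sh                 # always present, so prints "yes"
for tool in mvn ant findbugs; do
  echo "$tool: $(on_path "$tool")"   # expect "yes" for each once installed
done
```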
5. Install protobuf
[hxiaolong@hd1 software]$ wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.bz2
[hxiaolong@hd1 software]$ tar jxf protobuf-2.5.0.tar.bz2
[hxiaolong@hd1 software]$ cd protobuf-2.5.0
[hxiaolong@hd1 protobuf-2.5.0]$ ./configure
[hxiaolong@hd1 protobuf-2.5.0]$ make
[hxiaolong@hd1 protobuf-2.5.0]$ make install
Honestly, building and installing from source this way is rather painful, and it is easy to hit all kinds of dependency problems. Installing via yum is recommended instead:
[root@hd1 protobuf-2.5.0]# yum install protobuf
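The Hadoop 2.2.0 build invokes protoc and is picky about the protobuf version (it expects the 2.5.0 series), so a mismatched package is a common source of build failures. A sketch of a pre-flight check; the function name is invented here, and it only parses a version string, which in real use would come from `protoc --version` (output like "libprotoc 2.5.0"):

```shell
# Sketch: guard against a protoc version mismatch before starting the long
# Maven build. require_protoc_250 is a helper name invented here; feed it the
# output of `protoc --version`.
require_protoc_250() {
  case "$1" in
    *"2.5.0"*) echo "ok" ;;
    *)         echo "need protoc 2.5.0, got: $1" ;;
  esac
}

require_protoc_250 "libprotoc 2.5.0"   # prints "ok"
# Real usage: require_protoc_250 "$(protoc --version)"
```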
6. Compile Hadoop
1) Build Hadoop on the name node first
[hxiaolong@hd1 software]$ wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
[hxiaolong@hd1 software]$ tar zxf hadoop-2.2.0-src.tar.gz
[hxiaolong@hd1 software]$ cd hadoop-2.2.0-src
[hxiaolong@hd1 software]$ mvn package -DskipTests -Pdist,native -Dtar
The build failed partway through with the following error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hxiaolong/software/hadoop-2.2.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
[ERROR] around Ant part ...<exec dir="/home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:145 in /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
A quick search showed the cause was that cmake was not installed. Install it and retry:
[root@hd1 hadoop-2.4.0-src]# yum install cmake
Recompiling still failed, this time with: cannot access AbstractLifeCycle
Some digging showed this is a known Hadoop 2.2.0 bug; see https://issues.apache.org/jira/browse/HADOOP-10110
The fix:
Edit hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml, changing
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty</artifactId>
  <scope>test</scope>
</dependency>
to
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty</artifactId>
  <scope>test</scope>
</dependency>
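If you prefer to script the HADOOP-10110 edit rather than open the pom in an editor, a sed substitution can splice the jetty-util dependency in front of the existing jetty one. A sketch against a minimal stand-in fragment (the real target is hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml; back it up first, and note the `\n` in the replacement requires GNU sed):

```shell
# Sketch: insert the jetty-util test dependency ahead of the existing jetty
# dependency, as the HADOOP-10110 workaround requires. Shown against a small
# stand-in fragment rather than the real pom.xml.
cat > /tmp/pom-frag.xml <<'EOF'
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty</artifactId>
  <scope>test</scope>
</dependency>
EOF

# Rewrite the jetty artifactId line into: jetty-util + close of that
# dependency + opening of a fresh jetty dependency (GNU sed).
sed -i 's|<artifactId>jetty</artifactId>|<artifactId>jetty-util</artifactId>\n  <scope>test</scope>\n</dependency>\n<dependency>\n  <groupId>org.mortbay.jetty</groupId>\n  <artifactId>jetty</artifactId>|' /tmp/pom-frag.xml

grep -c '<dependency>' /tmp/pom-frag.xml   # prints 2: both dependencies present
```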
Compiling again finally succeeded:
[hxiaolong@hd1 software]$ mvn package -DskipTests -Pdist,native -Dtar
main:
     [exec] $ tar cf hadoop-2.2.0.tar hadoop-2.2.0
     [exec] $ gzip -f hadoop-2.2.0.tar
     [exec]
     [exec] Hadoop dist tar available at: /home/hxiaolong/software/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0.tar.gz
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:41.833s
[INFO] Finished at: Wed Jul 23 03:01:18 UTC 2014
[INFO] Final Memory: 159M/646M
[INFO] ------------------------------------------------------------------------
2) Copy the compiled native directory into /usr/hadoop-2.2.0/lib/
[hxiaolong@hd1 lib]$ rm -rf /usr/hadoop-2.2.0/lib/native
[hxiaolong@hd1 lib]$ cp -R /home/hxiaolong/software/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native /usr/hadoop-2.2.0/lib/
This step is essential.
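Since rm -rf deletes the only copy of the original (32-bit) native directory, a slightly safer variant is to move it aside first. A sketch using throwaway /tmp paths in place of the real build-output and install directories:

```shell
# Sketch: keep a backup of the old native directory instead of rm -rf'ing it.
# /tmp paths stand in for the real build output and install tree.
SRC=/tmp/demo-build/native   # stands in for .../hadoop-dist/target/hadoop-2.2.0/lib/native
LIB=/tmp/demo-install/lib    # stands in for /usr/hadoop-2.2.0/lib
mkdir -p "$SRC" "$LIB/native"
echo 64-bit > "$SRC/libhadoop.so"
echo 32-bit > "$LIB/native/libhadoop.so"

mv "$LIB/native" "$LIB/native.bak"  # keep the original 32-bit copy around
cp -R "$SRC" "$LIB/"

cat "$LIB/native/libhadoop.so"      # prints "64-bit": new build is in place
cat "$LIB/native.bak/libhadoop.so"  # prints "32-bit": old copy preserved
```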
3) scp the compiled native directory to the other nodes
[root@hd1 lib]# scp -r /home/hxiaolong/software/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/ hadoop2:/opt/hadoop-2.2.0/lib/
[root@hd1 lib]# scp -r /home/hxiaolong/software/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/ hadoop3:/opt/hadoop-2.2.0/lib/
If the rebuilt native directory is not synced to the other nodes, they will hit exactly the same problem.
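With more nodes, a loop keeps the distribution step uniform. A sketch, shown as a dry run that only echoes the scp commands (the hadoop2/hadoop3 hostnames and /opt path are the ones from the article; adjust for your cluster, and drop the echo to actually copy):

```shell
# Sketch: distribute the rebuilt native directory to every other node in one
# loop. "echo" makes this a dry run; remove it to really copy.
NATIVE=/home/hxiaolong/software/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native
for node in hadoop2 hadoop3; do
  echo scp -r "$NATIVE/" "$node:/opt/hadoop-2.2.0/lib/"
done
```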
4) Verify
[hxiaolong@hd2 native]$ hadoop fs -ls /
Found 1 items
drwxr-xr-x   - hxiaolong supergroup          0 2014-07-23 05:21 /input
OK, no more errors.
REF:
http://stackoverflow.com/questions/18889113/running-java-5-6-with-jni-on-java-7-gives-stack-guard-warning
http://blog.yidooo.net/archives/hadoop-source-compile.html
http://blog.csdn.net/jiedushi/article/details/7496327