Building Hadoop 2.2.0 from Source on 64-bit CentOS 6.4

Source: Internet · Editor: 程序博客网 · Posted: 2024/04/30 11:49

Environment: CentOS 6.4, 64-bit

1. Install the JDK — see here
2. Install Maven
Maven can be built from source, but here we simply download a prebuilt binary from the official download mirrors:
wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.1.1/binaries/apache-maven-3.1.1-bin.zip
After unpacking the archive, configure the environment variables in /etc/profile, just as for the JDK:

vim /etc/profile
export MAVEN_HOME=/opt/maven3.1.1
export PATH=$PATH:$MAVEN_HOME/bin
source /etc/profile
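If you script this setup, it is easy to append the same export block twice on repeated runs. A minimal idempotent sketch — it writes to a scratch file `./profile.demo` here purely for illustration; in practice you would point `PROFILE` at /etc/profile:

```shell
# Append the Maven environment block to a profile file only if it is
# not already there, so re-running the script never duplicates it.
PROFILE=${PROFILE:-./profile.demo}   # demo target; use /etc/profile for real
touch "$PROFILE"
if ! grep -q 'MAVEN_HOME' "$PROFILE"; then
    cat >> "$PROFILE" <<'EOF'
export MAVEN_HOME=/opt/maven3.1.1
export PATH=$PATH:$MAVEN_HOME/bin
EOF
fi
# Running the block again leaves the file unchanged.
grep -c 'MAVEN_HOME' "$PROFILE"
```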
Verify the configuration with mvn -version:

Apache Maven 3.1.1 (0728685237757ffbf44136acec0402957f723d9a; 2013-09-17 23:22:22+0800)
Maven home: /opt/maven3.1.1
Java version: 1.7.0_45, vendor: Oracle Corporation
Java home: /opt/jdk1.7/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"

Because Maven's overseas repositories may be unreachable, configure a domestic mirror first. In the Maven directory, edit conf/settings.xml and add the following inside <mirrors></mirrors>, leaving the existing entries untouched:

<mirror>
    <id>nexus-osc</id>
    <mirrorOf>*</mirrorOf>
    <name>Nexusosc</name>
    <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
Likewise, add a new entry inside <profiles></profiles>:
<profile>
    <id>jdk-1.7</id>
    <activation>
        <jdk>1.7</jdk>
    </activation>
    <repositories>
        <repository>
            <id>nexus</id>
            <name>local private nexus</name>
            <url>http://maven.oschina.net/content/groups/public/</url>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
    </repositories>
    <pluginRepositories>
        <pluginRepository>
            <id>nexus</id>
            <name>local private nexus</name>
            <url>http://maven.oschina.net/content/groups/public/</url>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </pluginRepository>
    </pluginRepositories>
</profile>
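A hand-edited settings.xml with an unclosed tag makes every subsequent mvn run fail with a parse error, so a quick sanity check before building saves time. This is only a dependency-free sketch (xmllint would be the proper tool) that compares open/close counts for the tags touched above, run here against a tiny sample file rather than your real settings.xml:

```shell
# Write a small sample file so the sketch runs anywhere; point the
# check at $MAVEN_HOME/conf/settings.xml in a real session.
cat > settings-sample.xml <<'EOF'
<settings><mirrors><mirror><id>nexus-osc</id></mirror></mirrors></settings>
EOF

# Count literal <tag> and </tag> occurrences and compare them.
check_balance() {  # usage: check_balance <tag> <file>
    opens=$(grep -o "<$1>" "$2" | wc -l)
    closes=$(grep -o "</$1>" "$2" | wc -l)
    if [ "$opens" -eq "$closes" ]; then
        echo "$1 balanced"
    else
        echo "$1 UNBALANCED"
    fi
}

check_balance mirror settings-sample.xml
```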
3. Install protoc 2.5.0
Building Hadoop 2.2.0 requires protoc 2.5.0, so download that as well.
Download page: https://code.google.com/p/protobuf/downloads/list — be sure to pick version 2.5.0.
Before building and installing protoc, install a few dependencies (skip any that are already present): file, gcc, gcc-c++, make.

yum install file
yum install gcc
yum install gcc-c++
yum install make

Build and install protoc:

tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/
make
make install
ldconfig -v
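The Hadoop 2.2.0 build aborts if protoc reports any version other than 2.5.0, so it is worth confirming before launching the long mvn run. A small sketch that parses the version field; it is fed a sample string here for illustration — in a real session you would pass in `$(protoc --version)`:

```shell
# `protoc --version` prints e.g. "libprotoc 2.5.0"; keep the version field.
protoc_version() {
    echo "$1" | awk '{print $2}'
}

required="2.5.0"
# Real run: ver=$(protoc_version "$(protoc --version)")
ver=$(protoc_version "libprotoc 2.5.0")   # sample output for illustration
if [ "$ver" = "$required" ]; then
    echo "protoc $ver OK"
else
    echo "need protoc $required, got '$ver'" >&2
fi
```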
4. Install the cmake, openssl-devel, and ncurses-devel dependencies (skip any that are already installed):
yum install cmake
yum install openssl-devel
yum install ncurses-devel
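With steps 1–4 done, every command-line tool the native build needs should now be on the PATH. A small helper to confirm that before kicking off the build; the last line is a demo call using tools any system has, while the commented call is what you would actually run:

```shell
# Report any of the given tools that cannot be found on the PATH.
check_tools() {
    missing=""
    for tool in "$@"; do
        command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
    done
    if [ -z "$missing" ]; then
        echo "all present"
    else
        echo "missing:$missing"
    fi
}

# For the Hadoop build you would run:
#   check_tools gcc g++ make cmake mvn protoc
check_tools sh ls   # demo with universally available tools -> "all present"
```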
5. Build Hadoop
First, download the Hadoop source tarball from an official mirror:
wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
Now the build can begin:

cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [3.709s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [2.229s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.270s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.388s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [3.485s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.655s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.782s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.731s]
[INFO] Apache Hadoop Common .............................. SUCCESS [1:52.476s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [9.935s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.110s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:58.347s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [26.915s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [17.002s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [5.292s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.073s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.335s]
[INFO] hadoop-yarn-api ................................... SUCCESS [54.478s]
[INFO] hadoop-yarn-common ................................ SUCCESS [39.215s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.241s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.601s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [21.566s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.754s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [20.625s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.755s]
[INFO] hadoop-yarn-client ................................ SUCCESS [6.748s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.155s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.661s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.160s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [36.090s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.753s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.151s]
[INFO] hadoop-yarn-project ............................... SUCCESS [4.771s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [24.870s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.812s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [15.759s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.831s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [8.126s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [2.320s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [9.596s]
[INFO] hadoop-mapreduce .................................. SUCCESS [3.905s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.118s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [11.651s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.671s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [10.038s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [6.062s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [4.104s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [4.210s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [9.419s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.306s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.037s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [21.579s]
[INFO] Apache Hadoop Client .............................. SUCCESS [7.299s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [7.347s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:53.144s
[INFO] Finished at: Fri Nov 22 16:58:32 CST 2013
[INFO] Final Memory: 70M/239M
[INFO] ------------------------------------------------------------------------
When you see output like the above, the build has finished.

The built distribution is at: hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0
The following commands show the Hadoop version and the native libraries that were built:

[root@localhost bin]# ./hadoop version
Hadoop 2.2.0
Subversion Unknown -r Unknown
Compiled by root on 2013-11-22T08:47Z
Compiled with protoc 2.5.0
From source with checksum 79e53ce7994d1628b240f09af91e1af4
This command was run using /data/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar

[root@localhost hadoop-2.2.0]# file lib//native/*
lib//native/libhadoop.a: current ar archive
lib//native/libhadooppipes.a: current ar archive
lib//native/libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
lib//native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib//native/libhadooputils.a: current ar archive
lib//native/libhdfs.a: current ar archive
lib//native/libhdfs.so: symbolic link to `libhdfs.so.0.0.0'
lib//native/libhdfs.so.0.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
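The `file` output above is the key check: the stock Apache binary tarball ships 32-bit native libraries, while a successful build on this machine should report "ELF 64-bit". A tiny helper that encodes the check, shown here against a sample string rather than a live library file:

```shell
# Decide whether a `file` report describes a 64-bit ELF object.
is_64bit() {  # usage: is_64bit "<output of: file path/to/lib.so>"
    case "$1" in
        *"ELF 64-bit"*) echo yes ;;
        *)              echo no ;;
    esac
}

# Real run: is_64bit "$(file lib/native/libhadoop.so.1.0.0)"
is_64bit "libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64"   # -> yes
```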
