Compiling the CDH 4.2.2 Source Code

Source: Internet | Editor: 程序博客网 | Date: 2024/05/16 14:03
Environment: CentOS 6.3, 32-bit.

1. Installing Maven and the JDK is not covered here; the focus is on what is specific to building the Hadoop source.
Native libraries: on Linux, you need the tools to create the native libraries. For RHEL (and hence also CentOS), first run:
yum -y install lzo-devel zlib-devel gcc autoconf automake libtool openssl ncurses-devel openssl-devel

2. cd into the HADOOP_HOME directory and run mvn install -DskipTests.
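The step above can be sketched as a small function. HADOOP_HOME is assumed to point at the unpacked CDH 4.2.2 source tree; the commented dist/native invocation follows upstream Hadoop 2.x's BUILDING.txt and may need adapting for CDH:

```shell
# Build step 2 as a reusable function. Assumes HADOOP_HOME is set to the
# source tree root and mvn/java are on PATH.
build_hadoop() {
  cd "$HADOOP_HOME"
  mvn install -DskipTests   # installs every module into the local Maven repo
  # To also produce a distribution tarball with native libraries (upstream
  # Hadoop 2.x profiles; verify they apply to your CDH tree):
  # mvn package -Pdist,native -DskipTests -Dtar
}
```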
Step 2 fails while building the hadoop-common project, which depends on Google's protobuf 2.3+, so protobuf must be installed first. Here we use the then-latest 2.5.0 release:
1. protobuf is a data serialization format from Google; for details see https://code.google.com/p/protobuf/
2. Download the latest protobuf from https://code.google.com/p/protobuf/downloads/list
3. Download protobuf-2.5.0.tar.gz, then extract and install it.
Extract: tar xvf protobuf-2.5.0.tar.gz
Install: (1) ./configure (2) make (3) make check (4) make install
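The extract-and-install steps above, gathered into one function for reference. It assumes protobuf-2.5.0.tar.gz has already been downloaded into the current directory and installs to the default prefix (/usr/local), so the install step needs root:

```shell
# Build and install protobuf 2.5.0 from a pre-downloaded source tarball.
build_protobuf_250() {
  tar xvf protobuf-2.5.0.tar.gz
  cd protobuf-2.5.0
  ./configure
  make
  make check        # optional self-tests
  make install      # run as root (or prefix with sudo)
}
```

After installation, `protoc --version` should report libprotoc 2.5.0; if protoc is not found, make sure /usr/local/bin is on PATH and run ldconfig.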

Running install again still fails; the pom.xml under hadoop-common-project/hadoop-common needs a change. Find:
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <scope>compile</scope>
    </dependency>
and add a version element:
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
      <scope>compile</scope>
    </dependency>
Now run install once more, and it succeeds.
 
The same problem comes up again later in these directories:
src/hadoop-common-project/hadoop-common
src/hadoop-hdfs-project/hadoop-hdfs
src/hadoop-mapreduce-project/hadoop-mapreduce-client
src/hadoop-mapreduce-project
src/hadoop-project
src/hadoop-yarn-project/hadoop-yarn
src/hadoop-yarn-project
The pom.xml in each of these directories needs the same change.
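Rather than editing each pom.xml by hand, the same one-line fix can be scripted. A minimal sketch using GNU sed (the function name and the demo file path are illustrative; in the real tree you would run it over each module listed above):

```shell
# Insert <version>2.5.0</version> right after the protobuf-java artifactId
# in the given pom.xml ('&' in the replacement is the matched text; \n in
# the replacement is a GNU sed extension).
fix_protobuf_version() {
  sed -i 's|<artifactId>protobuf-java</artifactId>|&\n      <version>2.5.0</version>|' "$1"
}

# Self-contained demo on a minimal dependency snippet:
cat > /tmp/pom-snippet.xml <<'EOF'
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <scope>compile</scope>
    </dependency>
EOF
fix_protobuf_version /tmp/pom-snippet.xml
grep '<version>' /tmp/pom-snippet.xml   # confirms the version line was inserted
```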
Reference: http://wiki.apache.org/hadoop/HowToContribute

