Building the Hadoop 2.x Native Libraries on Mac OS X


Source: http://www.micmiu.com/bigdata/hadoop/hadoop-build-native-library-on-mac-os-x/


The official contributor guide (http://wiki.apache.org/hadoop/HowToContribute) states that the Hadoop native libraries are supported only on *nix platforms: they are in wide use on GNU/Linux, but neither Cygwin nor Mac OS X is supported. A patch has since been published that makes the native libraries build on Mac OS X, and the steps below walk through that build in detail.

[1] Environment:

  • Hadoop 2.2.0
  • Mac OS X 10.9.1

For the detailed build dependencies (protoc, cmake, and so on), see the Hadoop 2.2.0 source build guide: http://www.micmiu.com/opensource/hadoop/hadoop-build-source-2-2-0/
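Before checking out the source, it can help to confirm that the required tools are on PATH. This is a minimal sketch; the tool list follows the linked build guide, and versions are not verified here:

```shell
# Report which of the build prerequisites are on PATH.
# Tool list per the linked build guide; versions are not checked.
out=$(for tool in java mvn cmake protoc; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done)
echo "$out"
```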

[2] Building the native libraries on Mac OS X:

1. Check out the Hadoop 2.2.0 source

$ svn co https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0/

2. Apply the patch

The JIRA issue https://issues.apache.org/jira/browse/HADOOP-9648 has the full discussion.

Patch download link: https://issues.apache.org/jira/secure/attachment/12617363/HADOOP-9648.v2.patch

# switch to the Hadoop source root first
$ wget https://issues.apache.org/jira/secure/attachment/12617363/HADOOP-9648.v2.patch
$ patch -p1 < HADOOP-9648.v2.patch

Note: to roll the patch back, run patch -RE -p1 < HADOOP-9648.v2.patch.
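For readers unfamiliar with patch, the apply/roll-back cycle can be rehearsed safely on a throwaway file first. The file and diff below are purely illustrative stand-ins for the Hadoop tree and HADOOP-9648.v2.patch; the flags are the same ones used above:

```shell
# Demonstrate apply and rollback with patch in a scratch directory.
# hello.txt and demo.patch are illustrative stand-ins only.
cd "$(mktemp -d)"
printf 'hello\n' > hello.txt
printf -- '--- a/hello.txt\n+++ b/hello.txt\n@@ -1 +1 @@\n-hello\n+hello native\n' > demo.patch
patch -p1 --dry-run < demo.patch   # check the patch applies cleanly first
patch -p1 < demo.patch             # apply it
grep -q 'hello native' hello.txt && echo applied
patch -RE -p1 < demo.patch         # roll it back, as described above
grep -q '^hello$' hello.txt && echo reverted
```

The --dry-run pass is worth keeping in the habit: it reports rejects without touching the tree.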

3. Build the native libraries

From the Hadoop source root, run the native build:

$ mvn package -Pdist,native -DskipTests -Dtar

A successful build ends with log output like the following:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.511s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.493s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [0.823s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.561s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.245s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [2.465s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [0.749s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [0.832s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.070s]
[INFO] Apache Hadoop Common .............................. SUCCESS [1:00.030s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [0.285s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.049s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:13.339s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [20.259s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [0.767s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [0.279s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.046s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.239s]
[INFO] hadoop-yarn-api ................................... SUCCESS [7.641s]
[INFO] hadoop-yarn-common ................................ SUCCESS [5.479s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.114s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [1.743s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [6.381s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [0.259s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [0.578s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.303s]
[INFO] hadoop-yarn-client ................................ SUCCESS [0.233s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.062s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.253s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.074s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1.504s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.242s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.172s]
[INFO] hadoop-yarn-project ............................... SUCCESS [1.235s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [3.664s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.183s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [0.495s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [1.296s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [0.580s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.213s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.344s]
[INFO] hadoop-mapreduce .................................. SUCCESS [1.303s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.257s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [9.925s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [0.282s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [0.403s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.283s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [0.197s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [0.241s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [8.249s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [0.492s]
[INFO] Apache Hadoop Client .............................. SUCCESS [0.373s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.133s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [0.439s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [0.596s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.044s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [0.194s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:44.266s
[INFO] Finished at: Fri Jan 17 10:06:17 CST 2014
[INFO] Final Memory: 66M/123M
[INFO] ------------------------------------------------------------------------
micmiu-mbp:trunk micmiu$

After the build succeeds, the following files appear under <HADOOP source root>/hadoop-dist/target/hadoop-2.2.0/lib/:

micmiu-mbp:lib micmiu$ tree
.
|____.DS_Store
|____native
| |____libhadoop.1.0.0.dylib
| |____libhadoop.a
| |____libhadoop.dylib
| |____libhadooppipes.a
| |____libhadooputils.a
| |____libhdfs.0.0.0.dylib
| |____libhdfs.a
| |____libhdfs.dylib

Then copy the native libraries generated above into the corresponding location of your deployment environment and create the symlinks:

$ ln -s libhadoop.1.0.0.dylib libhadoop.so
$ ln -s libhdfs.0.0.0.dylib libhdfs.so
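The symlink step can be rehearsed end to end in a scratch directory before touching the deployment tree; the touched files below are empty placeholders standing in for the built dylibs:

```shell
# Rehearse the symlink step in a scratch directory.
# The touched files are empty placeholders for the built dylibs.
dir=$(mktemp -d)
cd "$dir"
touch libhadoop.1.0.0.dylib libhdfs.0.0.0.dylib
ln -sf libhadoop.1.0.0.dylib libhadoop.so   # -sf makes the step repeatable
ln -sf libhdfs.0.0.0.dylib libhdfs.so
ls -l libhadoop.so libhdfs.so
```

On the real deployment, the same two ln commands are run inside the lib/native directory next to the copied dylibs.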

Prebuilt native libraries can be downloaded from: http://yun.baidu.com/s/1c0jBZDQ

[3] References:

  • http://wiki.apache.org/hadoop/HowToContribute
  • http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
  • https://issues.apache.org/jira/browse/HADOOP-9648
  • https://issues.apache.org/jira/browse/HADOOP-3659 

