Fixing the "util.NativeCodeLoader: Unable to load native-hadoop library for your platform" warning when running Hadoop on macOS

Source: Internet · Editor: 程序博客网 · Date: 2024/05/19 16:02

When testing Hadoop on macOS, both starting the Hadoop daemons and running Hadoop commands print the following warning:

./start-dfs.sh
17/04/23 18:20:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /Users/Starshine/Work/hadoop-2.6.4/logs/hadoop-Starshine-namenode-xjw.out
localhost: starting datanode, logging to /Users/Starshine/Work/hadoop-2.6.4/logs/hadoop-Starshine-datanode-xjw.out

The cause is that Hadoop cannot load native libraries (dynamic or static) built for the local platform (here macOS, but the same applies elsewhere). The fix is straightforward: recompile Hadoop on the local machine to produce the native libraries, then copy the resulting libraries into the Hadoop installation.
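Before rebuilding, it is worth confirming that the platform library really is missing. Below is a minimal sketch that just looks for the library file; HADOOP_HOME is an assumed variable pointing at your install, and the default path matches the layout used later in this article. (Hadoop's own hadoop checknative -a gives a more detailed per-library report.)

```shell
# Look for a platform-native libhadoop in the install's lib/native dir.
# On macOS the library is a .dylib; on Linux it would be a .so.
HADOOP_HOME="${HADOOP_HOME:-$HOME/Work/hadoop-2.6.4}"
if ls "$HADOOP_HOME"/lib/native/libhadoop.*dylib >/dev/null 2>&1; then
  echo "native libhadoop present"
else
  echo "native libhadoop missing -- expect the NativeCodeLoader warning"
fi
```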


Compiling Hadoop's native code requires the -Pnative argument (it activates the native profile defined in the Maven POM). The build command is:

mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true


The build hits a few snags, but with some patience they can all be worked through. In fact, to fix this particular warning it is enough for the Common project to compile successfully:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  1.374 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.303 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  2.000 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.360 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.488 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  2.261 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  1.523 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  2.278 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  1.243 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 47.707 s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  2.052 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  4.838 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.035 s]

My own build did not finish every module; it got stuck on Hadoop Pipes, where HadoopPipes.cc uses a struct (hmac_ctx_st) that is only declared, never defined. After some digging I found that when openssl was compiled, some of the evp and hmac headers were not copied into the make install directories, and the include chain is fairly convoluted. I copied them over by hand, hit further errors, and gave up for now; it can wait, since it does not affect the fix described here.
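Since only the Common module needs to succeed, the build can also be restricted so that Maven never reaches Hadoop Pipes at all. A sketch using Maven's standard -pl (project list) and -am (also make dependencies) flags; the module path is assumed from the 2.6.4 source tree layout:

```shell
# Build only the hadoop-common module plus its upstream dependencies,
# so later modules such as Hadoop Pipes are never attempted.
mvn package -Pdist,native -DskipTests -Dmaven.javadoc.skip=true \
    -pl hadoop-common-project/hadoop-common -am
```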


After the build completes, the following 3 files appear under hadoop-2.6.4-src/hadoop-common-project/hadoop-common/target/native/target/usr/local/lib:

-rwxr-xr-x  1 Starshine  staff  149572  4 22 18:36 libhadoop.1.0.0.dylib
-rw-r--r--  1 Starshine  staff  673720  4 22 18:36 libhadoop.a
lrwxr-xr-x  1 Starshine  staff      21  4 22 18:36 libhadoop.dylib -> libhadoop.1.0.0.dylib

Here libhadoop.dylib is a symlink to libhadoop.1.0.0.dylib. Copy these two files (copying only libhadoop.dylib also works) into the lib/native directory under the Hadoop installation.
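The copy itself is a one-liner, but note that libhadoop.dylib is a symlink, so it should be copied with a flag that preserves links (cp -a below). The paths are the ones from this walkthrough, guarded so the sketch is a no-op if they do not exist on your machine:

```shell
# Source: native libs produced by the build; destination: Hadoop install.
SRC=hadoop-2.6.4-src/hadoop-common-project/hadoop-common/target/native/target/usr/local/lib
DST="$HOME/Work/hadoop-2.6.4/lib/native"
if [ -d "$SRC" ]; then
  mkdir -p "$DST"
  # -a preserves the libhadoop.dylib -> libhadoop.1.0.0.dylib symlink.
  cp -a "$SRC/libhadoop.1.0.0.dylib" "$SRC/libhadoop.dylib" "$DST/"
fi
```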


Now, when starting Hadoop or running its commands, the warning no longer appears:

./start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /Users/Starshine/Work/hadoop-2.6.4/logs/hadoop-Starshine-namenode-xjw.out
localhost: starting datanode, logging to /Users/Starshine/Work/hadoop-2.6.4/logs/hadoop-Starshine-datanode-xjw.out
Starting secondary namenodes [localhost]
localhost: starting secondarynamenode, logging to /Users/Starshine/Work/hadoop-2.6.4/logs/hadoop-Starshine-secondarynamenode-xjw.out









