Hadoop Pipes "Server failed to authenticate" error and how to fix it


Problem description:

This concerns the Hadoop Pipes example in Section 3.5 of 《Hadoop实战》 (2nd edition).

The makefile looks like this:
HADOOP_INSTALL=/home/xxl/hadoop-1.1.2
PLATFORM=Linux-i386-32
SSL_INSTALL=/usr/local/ssl
CC=g++
CPPFLAGS=-m32 -I$(HADOOP_INSTALL)/c++/$(PLATFORM)/include -I$(SSL_INSTALL)/include
wordcount: wordcount.cpp
    $(CC) $(CPPFLAGS) $< -Wall -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils \
        -L$(SSL_INSTALL)/lib -lcrypto -lssl -ldl -lpthread -g -O2 -o $@

There is nothing wrong with the makefile above.
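For readers who do not have the book at hand: wordcount.cpp compiled here is the classic Hadoop Pipes word-count program. The sketch below follows the standard Pipes example that ships with Hadoop rather than the book's exact listing, so treat the class names and details as illustrative only.

#include <string>
#include <vector>

#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"
#include "hadoop/StringUtils.hh"

// Mapper: emit ("word", "1") for every whitespace-separated token in the input line.
class WordCountMapper : public HadoopPipes::Mapper {
public:
  WordCountMapper(HadoopPipes::TaskContext& context) {}
  void map(HadoopPipes::MapContext& context) {
    std::vector<std::string> words =
        HadoopUtils::splitString(context.getInputValue(), " ");
    for (size_t i = 0; i < words.size(); ++i) {
      context.emit(words[i], "1");
    }
  }
};

// Reducer: sum the counts emitted for each word.
class WordCountReducer : public HadoopPipes::Reducer {
public:
  WordCountReducer(HadoopPipes::TaskContext& context) {}
  void reduce(HadoopPipes::ReduceContext& context) {
    int sum = 0;
    while (context.nextValue()) {
      sum += HadoopUtils::toInt(context.getInputValue());
    }
    context.emit(context.getInputKey(), HadoopUtils::toString(sum));
  }
};

int main(int argc, char* argv[]) {
  // Hand control to the Pipes framework; it communicates with the Java parent
  // task over a socket, which is where the authentication handshake happens.
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<WordCountMapper, WordCountReducer>());
}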

Upload the compiled executable to a bin directory on HDFS:
~/hadoop-1.1.2/bin/hadoop fs -mkdir bin
~/hadoop-1.1.2/bin/hadoop dfs -put wordcount bin
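If the input text files are not on HDFS yet, they can be uploaded the same way. The file names below are only an illustration (the job log later shows two input paths being processed):

~/hadoop-1.1.2/bin/hadoop fs -mkdir input
~/hadoop-1.1.2/bin/hadoop fs -put file01 file02 input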

Run the wordcount program:
~/hadoop-1.1.2/bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input input -output output -program bin/wordcount


At this point the following error shows up:

xxl@xxl-pc:~/MapReduce/wordcount_cpp$ ~/hadoop-1.1.2/bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input /user/xxl/input/file0* -output /user/xxl/output/outputfile -program bin/wordcount
13/10/04 22:29:21 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
13/10/04 22:29:22 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/10/04 22:29:22 WARN snappy.LoadSnappy: Snappy native library not loaded
13/10/04 22:29:22 INFO mapred.FileInputFormat: Total input paths to process : 2
13/10/04 22:29:22 INFO mapred.JobClient: Running job: job_201310041509_0017
13/10/04 22:29:23 INFO mapred.JobClient:  map 0% reduce 0%
13/10/04 22:29:32 INFO mapred.JobClient: Task Id : attempt_201310041509_0017_m_000000_0, Status : FAILED
java.io.IOException
    at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
    at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
    at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
    at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
attempt_201310041509_0017_m_000000_0: Server failed to authenticate. Exiting

[The identical java.io.IOException stack trace repeats for attempts _m_000001_0, _m_000000_1, _m_000001_1, _m_000000_2 and _m_000001_2, each ending with "Server failed to authenticate. Exiting".]

13/10/04 22:29:59 INFO mapred.JobClient: Job complete: job_201310041509_0017
13/10/04 22:29:59 INFO mapred.JobClient: Counters: 7
13/10/04 22:29:59 INFO mapred.JobClient:   Job Counters 
13/10/04 22:29:59 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=65416
13/10/04 22:29:59 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/10/04 22:29:59 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/10/04 22:29:59 INFO mapred.JobClient:     Launched map tasks=8
13/10/04 22:29:59 INFO mapred.JobClient:     Data-local map tasks=8
13/10/04 22:29:59 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/10/04 22:29:59 INFO mapred.JobClient:     Failed map tasks=1
13/10/04 22:29:59 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201310041509_0017_m_000000
Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1327)
    at org.apache.hadoop.mapred.pipes.Submitter.runJob(Submitter.java:248)
    at org.apache.hadoop.mapred.pipes.Submitter.run(Submitter.java:479)
    at org.apache.hadoop.mapred.pipes.Submitter.main(Submitter.java:494)

After hitting the error above, I searched Stack Overflow and a number of blogs. The two most useful posts are linked below. The core of the fix is to recompile the two static libraries libhadooppipes.a and libhadooputils.a and overwrite the original ones with them. The concrete steps follow.

References:

http://www.linuxquestions.org/questions/linux-software-2/hadoop-1-0-3-pipes-server-failed-to-authenticate-4175429779/

http://guoyunsky.iteye.com/blog/1709654


1. Go into the ~/hadoop-1.1.2/src/c++/pipes directory and run the following commands:

./configure
make install

However, running ./configure failed for me with the following output:
xxl@xxl-pc:~/hadoop-1.1.2/src/c++/pipes$ ./configure 
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking for style of include used by make... GNU
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking dependency style of gcc... gcc3
checking how to run the C preprocessor... gcc -E
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking minix/config.h usability... no
checking minix/config.h presence... no
checking for minix/config.h... no
checking whether it is safe to define __EXTENSIONS__... yes
checking for special C compiler options needed for large files... no
checking for _FILE_OFFSET_BITS value needed for large files... 64
checking pthread.h usability... yes
checking pthread.h presence... yes
checking for pthread.h... yes
checking for pthread_create in -lpthread... yes
checking for HMAC_Init in -lssl... no
./configure: line 413: test: please: integer expression expected
./configure: line 416: $4: Bad file descriptor
configure: error: check
./configure: line 302: return: please: numeric argument required
./configure: line 312: exit: please: numeric argument required

I have never studied shell scripting, but it was still possible to feel my way around the configure script. The errors appear right after the "checking for HMAC_Init in -lssl... no" check, and following the error messages, lines 413 and 416 sit inside the following function:
as_fn_error ()
{
  as_status=$1; test $as_status -eq 0 && as_status=1
  if test "$4"; then
    as_lineno=${as_lineno-"$3"} as_lineno_stack=as_lineno_stack=$as_lineno_stack
    $as_echo "$as_me:${as_lineno-$LINENO}: error: $2" >&$4
  fi
  $as_echo "$as_me: error: $2" >&2
  as_fn_exit $as_status
} # as_fn_error

When an error is reported, this function calls as_fn_exit, which terminates the script. I simply commented that line out, so that the script reports the error but keeps running instead of exiting:
#as_fn_exit $as_status
This is certainly not a clean solution, but after spending a long time on it, it was the only way I got it to work.
Re-run:
./configure
make install
You will then find the newly generated .h and .a files under the ~/hadoop-1.1.2/src/c++/install directory.

2. Similarly, run the following in the ~/hadoop-1.1.2/src/c++/utils directory:

./configure
make install
This time the problem that occurred in the pipes directory did not appear.


Once the new libhadooppipes.a and libhadooputils.a static libraries and the accompanying header files have been generated, copy them over the files in the include and lib directories under ~/hadoop-1.1.2/c++/Linux-i386-32/ (a sketch of the copy commands is given below). Then, as both articles above recommend, restart Hadoop and re-run the C++ program.
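A minimal sketch of the copy step, assuming make install placed its output under ~/hadoop-1.1.2/src/c++/install as in step 1; check the actual layout of that directory first and adjust the paths if they differ:

cp ~/hadoop-1.1.2/src/c++/install/include/hadoop/* ~/hadoop-1.1.2/c++/Linux-i386-32/include/hadoop/
cp ~/hadoop-1.1.2/src/c++/install/lib/*.a ~/hadoop-1.1.2/c++/Linux-i386-32/lib/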

In the end the job ran successfully, with no further problems.
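To double-check the result, the word counts can be read back from HDFS with something like the command below; the path is taken from the -output argument used in the run above, and the part-00000 file name assumes the default single reducer:

~/hadoop-1.1.2/bin/hadoop fs -cat /user/xxl/output/outputfile/part-00000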