Hadoop error: java.io.IOException at org.apache.hadoop.mapred.pipes.OutputHandler...

Error log:
12/09/30 18:35:26 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
12/09/30 18:35:26 INFO util.NativeCodeLoader: Loaded the native-hadoop library
12/09/30 18:35:26 WARN snappy.LoadSnappy: Snappy native library not loaded
12/09/30 18:35:26 INFO mapred.FileInputFormat: Total input paths to process : 1
12/09/30 18:35:27 INFO mapred.JobClient: Running job: job_201209301832_0001
12/09/30 18:35:28 INFO mapred.JobClient:  map 0% reduce 0%
12/09/30 18:35:40 INFO mapred.JobClient: Task Id : attempt_201209301832_0001_m_000000_0, Status : FAILED
java.io.IOException
        at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
        at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
        at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
        at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
attempt_201209301832_0001_m_000000_0: Server failed to authenticate. Exiting

Solution:
Hello everybody!

After some googling and trial and error, I finally found the solution to this one. I suspect that other people may also run into this problem, so I'm posting it here.

I use Hadoop-1.0.3 (the tar.gz package, not the .deb or .rpm).
In my program's Makefile I was initially linking against the libraries in $(HADOOP_INSTALL)/c++/Linux-amd64-64/.
I actually had to recompile these from source, with a couple of tweaks first, and link against the new ones instead.

So, first of all, since I'm running Slackware64 14.0, I enabled multilib support.

Then

1. Export a variable LIB=-lcrypto. (I actually put it in /etc/profile, so that I don't have to export it every time.)

2. in $(HADOOP_INSTALL)/src/c++/pipes/impl/HadoopPipes.cc add
Code:
#include <unistd.h>
3. In $(HADOOP_INSTALL)/src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java, replace the two (2) lines described here.

4. In $(HADOOP_INSTALL)/src/c++/utils run
Code:
./configure
make install
5. In $(HADOOP_INSTALL)/src/c++/pipes run
Code:
./configure
make install
6. In the new Makefile, use
Code:
-I$(HADOOP_INSTALL)/src/c++/install/include -L$(HADOOP_INSTALL)/src/c++/install/lib -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread
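Putting step 6 together, a minimal Makefile for a Pipes program might look like the sketch below. The target name `wordcount`, the source file `wordcount.cc`, and the default install path are illustrative placeholders; only the include/library flags come from the steps above.

```makefile
# Illustrative default; point this at your own Hadoop source tree
HADOOP_INSTALL ?= /opt/hadoop-1.0.3

CXX      = g++
# Headers and libraries installed by the ./configure && make install steps above
CPPFLAGS = -I$(HADOOP_INSTALL)/src/c++/install/include
LDFLAGS  = -L$(HADOOP_INSTALL)/src/c++/install/lib
LDLIBS   = -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread

# "wordcount" / "wordcount.cc" are placeholder names for your Pipes program
wordcount: wordcount.cc
	$(CXX) $(CPPFLAGS) $< -o $@ $(LDFLAGS) $(LDLIBS)
```

Note that -lcrypto and -lssl must come after the Hadoop libraries on the link line, since the pipes library depends on OpenSSL symbols.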
That was it. The program runs fine now.