Handling the Hadoop "Failed to set permissions of path" Error


Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\Hadoop-Administrator\mapred\staging\Administrator-4954228\.staging to 0700
 at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
 at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
 at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
 at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
 at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
 at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
 at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
 at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Unknown Source)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
 at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
 at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
 at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
 at org.apache.nutch.util.NutchJob.waitForCompletion(NutchJob.java:50)
 at org.apache.nutch.crawl.GeneratorJob.run(GeneratorJob.java:191)
 at org.apache.nutch.crawl.Crawler.runTool(Crawler.java:68)
 at org.apache.nutch.crawl.Crawler.run(Crawler.java:152)
 at org.apache.nutch.crawl.Crawler.run(Crawler.java:250)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
 at org.apache.nutch.crawl.Crawler.main(Crawler.java:257)

This is a file-permission problem on Windows; on Linux the same job runs normally and the error does not occur.
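The root cause can be confirmed with a quick JDK-only check (an illustration only, not part of the fix): the default Windows file system does not expose the POSIX permission model that Hadoop 1.x's RawLocalFileSystem relies on when it tries to set the staging directory to 0700.

```java
import java.nio.file.FileSystems;

public class PosixCheck {
    public static void main(String[] args) {
        // Hadoop 1.x sets the staging dir to 0700 via POSIX-style permissions.
        // On Windows the default file system has no "posix" attribute view,
        // so the permission call reports failure and checkReturnValue throws
        // the IOException shown in the stack trace above.
        boolean posix = FileSystems.getDefault()
                                   .supportedFileAttributeViews()
                                   .contains("posix");
        System.out.println("POSIX permissions supported: " + posix);
    }
}
```

On a typical Linux box this prints `true`; on Windows it prints `false`, which is exactly why the 0700 check fails there.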

The fix is to edit checkReturnValue in /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java and comment out its body (somewhat crude, but on Windows the check can simply be skipped):

......

private static void checkReturnValue(boolean rv, File p, FsPermission permission)
    throws IOException {
  /** if (!rv) {
        throw new IOException("Failed to set permissions of path: " + p +
            " to " + String.format("%04o", permission.toShort()));
      } **/
}

......

Then recompile and repackage hadoop-core-1.0.2.jar, and replace the hadoop-core-1.0.2.jar in the hadoop-1.0.2 root directory with it.
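A sketch of the rebuild step, assuming the stock Ant build of the Hadoop 1.0.2 source release (the Ant target and the exact jar name under build/ may differ in your checkout):

```shell
# Run from the hadoop-1.0.2 source root after editing
# src/core/org/apache/hadoop/fs/FileUtil.java
ant jar

# The rebuilt core jar lands under build/; copy it over the original
# (the jar under build/ may carry a version or SNAPSHOT suffix)
cp build/hadoop-core-*.jar ./hadoop-core-1.0.2.jar
```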

A pre-modified hadoop-core-1.0.2-modified.jar is provided here; simply use it to replace the original hadoop-core-1.0.2.jar.

After replacing the jar, refresh the project and set up the correct jar dependencies; WordCountTest will now run.

Once it succeeds, refresh the HDFS directory in Eclipse and you can see the generated ouput2 directory:

[screenshot: Eclipse DFS view showing the generated ouput2 directory]

https://skydrive.live.com/?cid=cf7746837803bc50&id=CF7746837803BC50%211276
