Problems encountered when connecting Eclipse on Windows to a Hadoop cluster


Prerequisites:
A Hadoop cluster (hadoop-2.5.0) is already deployed on virtual machines.
Download hadoop-eclipse-plugin-2.5.0 from the web and save it into eclipse\plugins.
Configuration steps:
1 Window -> Preferences: add the path to the extracted Hadoop package.
2 Window -> Open Perspective -> Other -> Map/Reduce -> OK
3 In the Map/Reduce Locations view, right-click -> New Hadoop location
The Map/Reduce Master port corresponds to
yarn.resourcemanager.scheduler.address in yarn-site.xml;
its default value is 8030.
The DFS Master port corresponds to
dfs.namenode.rpc-address in hdfs-site.xml;
its default value is 8020.
When the fields are filled in, click Finish.
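Before blaming the plugin, it can help to confirm that the two ports entered in the Hadoop location dialog are actually reachable from the Windows machine. The following is a minimal JDK-only sketch (not part of the original setup); the hostname "master" and the ports 8020/8030 are assumptions — substitute the values from your own hdfs-site.xml and yarn-site.xml.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "master" is a placeholder hostname for the cluster's
        // NameNode / ResourceManager; replace it with your own.
        System.out.println("DFS Master (8020):  " + isReachable("master", 8020, 2000));
        System.out.println("M/R Master (8030): " + isReachable("master", 8030, 2000));
    }
}
```

If either check prints false, the problem is network-level (firewall, hosts file, or a service bound to a different address) rather than an Eclipse configuration issue.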

Problems encountered:
Eclipse reports "Permission denied" when connecting to Hadoop remotely.
Solution: disable HDFS permission checking by adding the following property to hdfs-site.xml on the cluster (suitable for a development sandbox only), then restart HDFS:

<property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
    <description>
        If "true", enable permission checking in HDFS.
        If "false", permission checking is turned off,
        but all other behavior is unchanged.
        Switching from one parameter value to the other does not change the mode,
        owner or group of files or directories.
    </description>
</property>
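Turning off dfs.permissions.enabled removes permission checks for everyone, which you may not want even on a test cluster. A commonly used alternative is to make the Windows client identify itself as the HDFS user that owns the target directories by setting HADOOP_USER_NAME before any Hadoop client object is created. A minimal sketch — the user name "hadoop" is an assumption; use whatever user owns the data on your cluster:

```java
public class SetHadoopUser {
    public static void main(String[] args) {
        // Hadoop's UserGroupInformation checks the HADOOP_USER_NAME
        // environment variable and then this system property, so it must
        // be set before the first FileSystem or Job object is created.
        // "hadoop" is a placeholder user name.
        System.setProperty("HADOOP_USER_NAME", "hadoop");
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```

With this in place, HDFS permission checking can stay enabled on the cluster.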

Test run (using the WordCount program as an example)
The run fails with the following error:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:774)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:646)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:434)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:281)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at WordCount.main(WordCount.java:70)

The log4j:WARN lines only mean that log4j has not been configured; they are not the cause of the failure. The NullPointerException occurs because running Map/Reduce programs on Windows requires winutils.exe and hadoop.dll. Download the 32-bit or 64-bit versions matching your system, copy both files into HADOOP_HOME\bin, and additionally copy hadoop.dll into C:\Windows\System32. After doing this, run WordCount again.
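Before re-running, it is worth verifying that winutils.exe ended up exactly where Hadoop's Shell class will look for it, i.e. under the bin directory of HADOOP_HOME. A small JDK-only sketch of that check (an illustration, not part of the original post):

```java
import java.io.File;

public class WinutilsCheck {
    // Builds the path <hadoopHome>/bin/winutils.exe, which is where
    // Hadoop's Shell class expects the binary; returns null if
    // hadoopHome itself is not set.
    static File winutils(String hadoopHome) {
        if (hadoopHome == null) {
            return null;
        }
        return new File(new File(hadoopHome, "bin"), "winutils.exe");
    }

    public static void main(String[] args) {
        // On the client side Hadoop resolves HADOOP_HOME from the
        // environment (or from the hadoop.home.dir system property).
        String home = System.getenv("HADOOP_HOME");
        File exe = winutils(home);
        if (exe == null) {
            System.out.println("HADOOP_HOME is not set");
        } else {
            System.out.println(exe + " exists: " + exe.exists());
        }
    }
}
```

If this prints "HADOOP_HOME is not set", define the environment variable (or pass -Dhadoop.home.dir=... to the JVM) before starting Eclipse; if it prints "exists: false", the files were copied to the wrong directory.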
