Hadoop exception: hdfs.server.datanode.DataNode: Problem connecting to server: localhost/127.0.0.1:8020
Source: Internet · Editor: 程序博客网 · Date: 2024/06/05 17:19
I have a pseudo-distributed Hadoop installation on my Ubuntu 13.10 machine, and I wanted to run a basic test program.
My machine is named Tank, and my user account on it is joe.
Hadoop is installed locally under /usr/local/hadoop-2.4.0.
My single-node pseudo-distributed configuration follows steps 1, 2, and 3 of this article: http://www.csdn123.com/html/itweb/20130801/34361_34373_34414.htm, with one difference in step 3:
where that article uses dfs.name.dir I use dfs.namenode.name.dir, and where it uses dfs.data.dir I use dfs.datanode.data.dir (the Hadoop 2.x property names).
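For context, a minimal sketch of what such a pseudo-distributed configuration might look like — the port 8020 matches the log messages below, and the file paths are assumptions based on the /usr/local/hadoop-2.4.0/dfs directory described in this post:

```xml
<!-- core-site.xml: NameNode RPC address; 8020 is the port the DataNode
     is trying to reach in the log output below -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>

<!-- hdfs-site.xml: single-node setup, storage dirs assumed under the
     install directory as described in this post -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop-2.4.0/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/hadoop-2.4.0/dfs/data</value>
  </property>
</configuration>
```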
What follows is a chain of mishaps that took a whole evening to debug. While trying to fix another problem, I thought I was being clever and deleted /usr/local/hadoop-2.4.0/dfs, assuming Hadoop would simply recreate it the next time I started it. It did recreate dfs with a name directory inside, and I then created the data directory by hand. But the logs under /usr/local/hadoop-2.4.0/logs kept showing this error:
2014-07-13 22:15:52,014 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:15:53,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:15:54,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:15:55,016 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:15:56,017 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:15:57,018 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:15:58,019 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:15:59,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:16:00,020 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:16:01,021 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2014-07-13 22:16:01,023 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: localhost/127.0.0.1:8020
I tried many approaches and none of them worked. In the end, the fix was simply to stop Hadoop and re-format. Since this is a pseudo-distributed setup and HDFS contained nothing, I formatted both the namenode and the datanode:

$ hadoop namenode -format
$ hadoop datanode -format

(Note: in Hadoop 2.x, `hadoop namenode -format` is deprecated in favor of `hdfs namenode -format`.)
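The underlying cause is that deleting and recreating the storage directories leaves the DataNode's metadata out of sync with the NameNode's. A recovery sequence along these lines should work — this is a sketch, assuming the /usr/local/hadoop-2.4.0 install path and the dfs/name, dfs/data directories described above; adjust the paths to your own dfs.namenode.name.dir and dfs.datanode.data.dir values:

```shell
# Stop all HDFS daemons first (Hadoop 2.x sbin layout)
/usr/local/hadoop-2.4.0/sbin/stop-dfs.sh

# Remove the stale DataNode storage so its metadata cannot
# conflict with the freshly formatted NameNode
# (path assumed from the post's dfs.datanode.data.dir)
rm -rf /usr/local/hadoop-2.4.0/dfs/data

# Re-format the NameNode; `hdfs namenode -format` is the
# Hadoop 2.x form of the deprecated `hadoop namenode -format`
/usr/local/hadoop-2.4.0/bin/hdfs namenode -format

# Restart HDFS and verify that both NameNode and DataNode
# show up in the JVM process list
/usr/local/hadoop-2.4.0/sbin/start-dfs.sh
jps
```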