hadoop:RemoteException
While starting the Hadoop cluster to test HBase, only two of the three DataNodes came up. The log on the node that failed to start contained the following exception:
2012-09-07 23:58:51,240 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode is shutting down: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.UnregisteredDatanodeException: Data node 192.168.100.11:50010 is attempting to report storage ID DS-1282452139-218.196.207.181-50010-1344220553439. Node 192.168.100.12:50010 is expected to serve this storage.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDatanode(FSNamesystem.java:4608)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.processReport(FSNamesystem.java:3460)
at org.apache.hadoop.hdfs.server.namenode.NameNode.blockReport(NameNode.java:1001)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
at org.apache.hadoop.ipc.Client.call(Client.java:1070)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at $Proxy5.blockReport(Unknown Source)
at org.apache.hadoop.hdfs.server.datanode.DataNode.offerService(DataNode.java:958)
at org.apache.hadoop.hdfs.server.datanode.DataNode.run(DataNode.java:1458)
at java.lang.Thread.run(Thread.java:722)
This exception occurs because two DataNodes are reporting the same storageID to the NameNode. In my case it was most likely caused by setting up the new node as a direct copy of an existing installation, data directory included, instead of starting from a clean one. The fix is to delete the data directory (the dfs.data.dir location) on the failing machine and restart its DataNode, which will then register with the NameNode and be assigned a fresh storage ID.
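A minimal sketch of the diagnosis and fix, assuming a Hadoop 1.x layout; the `DATA_DIR` path is a placeholder — substitute the actual value of `dfs.data.dir` from your hdfs-site.xml:

```shell
# Assumption: dfs.data.dir points here; replace with your configured path.
DATA_DIR=/tmp/dn

# 1. Confirm the conflict: run this on both DataNodes and compare.
#    A cloned installation will show the identical storageID on each node.
grep '^storageID=' "$DATA_DIR/current/VERSION"

# 2. On the failing node only: stop the DataNode, wipe its storage,
#    and restart. On re-registration the NameNode assigns a new storage ID.
#    (Commands assume $HADOOP_HOME is set; do NOT run this on the healthy node.)
# $HADOOP_HOME/bin/hadoop-daemon.sh stop datanode
# rm -rf "$DATA_DIR"
# $HADOOP_HOME/bin/hadoop-daemon.sh start datanode
```

Note that wiping the data directory discards the block replicas stored on that node; HDFS will re-replicate them from the other DataNodes, so this is safe only when the cluster still holds enough replicas elsewhere.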