Hadoop distcp fails with a Permission denied error
Hadoop's distcp command copies files from one HDFS filesystem to another, like so:
$ bin/hadoop distcp -overwrite hdfs://123.123.23.111:9000/hsd/t_url hdfs://123.123.23.156:9000/data/t_url

Normally the run produces output like the following:
Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file: /tmp/hsperfdata_hugetable/16744
Try using the -Djava.io.tmpdir= option to select an alternate temp location.
15/04/29 20:35:07 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=false, maxMaps=20, sslConfigurationFile='null', copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[hdfs://192.168.34.135:9000/zyx/t_url], targetPath=hdfs://192.168.34.156:9000/data/t_url, targetPathExists=false, preserveRawXattrs=false}
15/04/29 20:35:07 INFO client.RMProxy: Connecting to ResourceManager at compute-23-06.local/192.168.34.135:8032
15/04/29 20:35:08 INFO Configuration.deprecation: io.sort.mb is deprecated. Instead, use mapreduce.task.io.sort.mb
15/04/29 20:35:08 INFO Configuration.deprecation: io.sort.factor is deprecated. Instead, use mapreduce.task.io.sort.factor
15/04/29 20:35:08 WARN conf.Configuration: bad conf file: element not <property>
15/04/29 20:35:08 WARN conf.Configuration: bad conf file: element not <property>
15/04/29 20:35:08 INFO client.RMProxy: Connecting to ResourceManager at compute-23-06.local/192.168.34.135:8032
15/04/29 20:35:09 INFO mapreduce.JobSubmitter: number of splits:21
15/04/29 20:35:10 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1429262156603_0032
15/04/29 20:35:10 INFO impl.YarnClientImpl: Submitted application application_1429262156603_0032
15/04/29 20:35:10 INFO mapreduce.Job: The url to track the job: http://compute-23-06.local:8088/proxy/application_1429262156603_0032/
15/04/29 20:35:10 INFO tools.DistCp: DistCp job-id: job_1429262156603_0032
15/04/29 20:35:10 INFO mapreduce.Job: Running job: job_1429262156603_0032
15/04/29 20:35:21 INFO mapreduce.Job: Job job_1429262156603_0032 running in uber mode : false
15/04/29 20:35:21 INFO mapreduce.Job:  map 0% reduce 0%
15/04/29 20:35:32 INFO mapreduce.Job:  map 10% reduce 0%
15/04/29 20:35:33 INFO mapreduce.Job:  map 18% reduce 0%
15/04/29 20:35:34 INFO mapreduce.Job:  map 25% reduce 0%
……
In my case, however, the job failed with a Permission denied error:
Error: java.io.IOException: File copy failed: hdfs://192.168.34.135:9000/zyx/t_url/000031_0 --> hdfs://192.168.34.156:9000/data/000031_0
	at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:284)
	at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:252)
	at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:50)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hdfs://192.168.34.135:9000/zyx/t_url/000031_0 to hdfs://192.168.34.156:9000/data/000031_0
	at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
	at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:280)
	... 10 more
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hugetable, access=WRITE, inode="/data":root:supergroup:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:238)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:179)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5584)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5566)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5540)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2282)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2235)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2188)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:505)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:354)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)

The message makes the cause obvious: insufficient permissions. The target directory /data is owned by root:supergroup with mode drwxr-xr-x. Checking with hadoop fs -ls / confirms it: the last three characters of drwxr-xr-x are the permissions for users outside the owning group, and they currently grant only r (read) and x (execute), no w (write). The user running distcp (hugetable) therefore cannot write into the target directory.
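The same permission check can be reproduced locally without a cluster, since HDFS follows the standard Unix mode-bit model. A minimal sketch, assuming a Linux machine with GNU coreutils (/tmp/perm_demo is a throwaway path invented for this demo, not part of the cluster):

```shell
# Create a directory with the same mode the NameNode reported above.
mkdir -p /tmp/perm_demo
chmod 755 /tmp/perm_demo           # 755 == rwxr-xr-x, the mode on /data
stat -c '%A %a' /tmp/perm_demo     # prints: drwxr-xr-x 755
# The last triad (r-x) applies to "other" users: read and traverse only.
# With no w bit set, any non-owner, non-group user is denied when trying
# to create a file inside the directory -- the same WRITE access check
# that the NameNode enforces in the stack trace above.
```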
The permissions can be fixed with chmod:

$ hadoop fs -chmod 777 /lgh
$ hadoop fs -chmod 777 /data

These commands grant mode 777 (all permissions for all users) on the HDFS directories /lgh and /data; note that chmod only succeeds when run as the directory's owner or the HDFS superuser. Running hadoop fs -ls / again shows that the new permissions have been applied.
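Mode 777 solves the problem but opens the directory to everyone. A sketch of what the octal digits mean and of narrower alternatives, demonstrated on a local throwaway directory (/tmp/chmod_demo is an assumed path for illustration):

```shell
d=/tmp/chmod_demo
mkdir -p "$d"
chmod 777 "$d"
stat -c '%a %A' "$d"    # 777 drwxrwxrwx -- every user can write
chmod 775 "$d"
stat -c '%a %A' "$d"    # 775 drwxrwxr-x -- "other" users lose the w bit
# On HDFS the equivalent, run as the owner or the HDFS superuser:
#   hadoop fs -chmod -R 775 /data
# A narrower fix than opening the mode is to transfer ownership to the
# user running distcp instead:
#   hadoop fs -chown hugetable /data
```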
End of article. When reposting, please credit the source: http://blog.csdn.net/ghuil/article/details/45372469