hadoop fs -put: Exception in thread "main" java.lang.UnsatisfiedLinkError


While putting a file onto HDFS with Hadoop I hit this error. The full trace is:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
    at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
    at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
    at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
    at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
    at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2217)
    at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
    at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
    at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
    at org.apache.hadoop.io.IOUtils.closeStream(IOUtils.java:254)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:61)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
    at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:466)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:391)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:328)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
    at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:306)
    at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:278)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:243)
    at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:260)
    at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:244)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:220)
    at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
    at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:190)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)

I searched around online, but nearly every question I found was about connecting to Hadoop from Windows, whereas in my case a plain `hadoop fs -put` on the cluster machine itself was failing.

On Baidu Zhidao there was only a single question with this error, and that poster had simply written the HDFS path wrong. I carefully redeployed my environment from scratch: `hadoop fs -mkdir` succeeded, but `put` still failed. That pattern actually fits the stack trace: `mkdir` is a metadata-only RPC in pure Java, while `put` writes file data and computes block checksums through the NativeCrc32 JNI code, which is exactly where the exception is thrown.

In my case the cause was:
1. The libhadoop.so and libhdfs.so under the JRE's lib directory did not match the current Hadoop version.

Fix:
Copy libhadoop.so, libhadoop.so.1.0.0, libhdfs.so, and libhdfs.so.0.0.0 from /hadoop-2.x.x/lib/native into the jre/lib/amd64 directory of your Java installation.
For me that was /usr/lib/java/jdk1.8.0_51/jre/lib/amd64.
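The copy step above can be sketched as a small shell helper. The `install_native_libs` function and the example paths in the comments are my own illustration (adjust them to your HADOOP_HOME and JAVA_HOME); `hadoop checknative -a` is the standard Hadoop command for verifying that the native libraries actually load.

```shell
# Copy Hadoop's native libraries into the JRE's library directory.
# Usage: install_native_libs <hadoop-native-dir> <jre-lib-dir>
install_native_libs() {
    src="$1"    # e.g. /usr/local/hadoop-2.x.x/lib/native  (assumption -- adjust)
    dest="$2"   # e.g. /usr/lib/java/jdk1.8.0_51/jre/lib/amd64
    for lib in libhadoop.so libhadoop.so.1.0.0 libhdfs.so libhdfs.so.0.0.0; do
        cp "$src/$lib" "$dest/"
    done
}

# Example invocation (paths are assumptions from my setup):
# install_native_libs /usr/local/hadoop-2.x.x/lib/native /usr/lib/java/jdk1.8.0_51/jre/lib/amd64
# hadoop checknative -a   # then confirm "hadoop: true" in the report
```

After copying, rerun `hadoop fs -put`; if `hadoop checknative -a` still reports the native library as missing, the .so files you copied are likely from a mismatched Hadoop build.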

One last gripe: this problem cost me a whole day. Maybe I'm just a rookie, but I first saw the exception in the output of a Spark job, and only later discovered that HDFS itself was broken across the board.
