Hadoop Learning, Part 2 (Operating HDFS via the Java API)
The previous article covered starting Hadoop on an Ubuntu virtual machine.
This article operates HDFS from a Windows machine through the Java API. I ran into a few problems along the way, which are briefly described here.
Tool: IntelliJ IDEA
First, create a Maven project and add the hadoop-common and hadoop-mapreduce-client-core dependencies to pom.xml:
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
I tested downloading a file from HDFS; the steps follow.
On the Linux Hadoop server, I created a file named newrt (that is, I uploaded the JDK's rt.jar to HDFS).
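The upload itself can be done with the HDFS shell. A minimal sketch, assuming rt.jar sits under the JDK's jre/lib directory on the server (adjust the path to your installation):

hdfs dfs -put $JAVA_HOME/jre/lib/rt.jar /newrt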
On the Windows machine, create a class named HDFSTest with the following code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

/**
 * Created by lixintang on 2017/4/17.
 */
public class HDFSTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the NameNode running on the Ubuntu VM
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.56.101:9000"), conf);
        // Stream the HDFS file to a local file; the final "true"
        // tells copyBytes to close both streams when done
        InputStream in = fs.open(new Path("/newrt"));
        OutputStream out = new FileOutputStream("D:/hadoop/hdfs/rt.jar");
        IOUtils.copyBytes(in, out, 4096, true);
    }
}
Running it produced the following errors:
1 [main] DEBUG org.apache.hadoop.util.Shell - Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:303)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:328)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2807)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2802)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2668)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at com.xintangli.bigdata.hadooptest.hdfs.HDFSTest.main(HDFSTest.java:21)
6 [main] ERROR org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2807)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2802)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2668)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at com.xintangli.bigdata.hadooptest.hdfs.HDFSTest.main(HDFSTest.java:21)
The error messages indicate that neither HADOOP_HOME nor hadoop.home.dir is set; on Windows, Hadoop's Shell utility also expects to find winutils.exe under that directory's bin folder (hence the "null\bin\winutils.exe" in the second trace).
One fix is to set hadoop.home.dir before creating the FileSystem, pointing it at a local Hadoop directory:
System.setProperty("hadoop.home.dir", "D:\\soft\\dev_soft\\database\\hadoop\\hadoop-2.7.1");
public class HDFSTest {
    public static void main(String[] args) throws Exception {
        System.setProperty("hadoop.home.dir", "D:\\soft\\dev_soft\\database\\hadoop\\hadoop-2.7.1");
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.56.101:9000"), conf);
        InputStream in = fs.open(new Path("/newrt"));
        OutputStream out = new FileOutputStream("D:/hadoop/hdfs/rt.jar");
        IOUtils.copyBytes(in, out, 4096, true);
    }
}
Running it again produced a different error:
No FileSystem for scheme: hdfs
Some research showed this is caused by the missing hadoop-hdfs dependency, which provides the DistributedFileSystem implementation behind the hdfs:// scheme. Add it to the pom:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.1</version>
</dependency>
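As an aside: if the scheme still cannot be resolved with hadoop-hdfs on the classpath (for example, a shaded fat jar can merge away the META-INF/services registration that FileSystem uses for discovery), a commonly cited workaround is to bind the implementation explicitly. A sketch, not needed for the setup above:

Configuration conf = new Configuration();
// Map the "hdfs" scheme to its implementation class directly;
// normally service discovery from the hadoop-hdfs jar handles this.
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());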
With that dependency in place, the program runs correctly and downloads newrt from HDFS to the local file D:/hadoop/hdfs/rt.jar.
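For completeness, uploading works the same way in the other direction. A minimal sketch under the same assumptions (NameNode address, local Hadoop directory); the class name and the local and remote paths here are hypothetical:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class HDFSUploadTest {
    public static void main(String[] args) throws Exception {
        System.setProperty("hadoop.home.dir", "D:\\soft\\dev_soft\\database\\hadoop\\hadoop-2.7.1");
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.56.101:9000"), conf);
        // Copy a local file up to HDFS; this also requires
        // network access from Windows to the DataNodes
        fs.copyFromLocalFile(new Path("D:/hadoop/hdfs/test.txt"), new Path("/test.txt"));
        fs.close();
    }
}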