Hadoop - HDFS Basic Operations (Java API)
Source: Internet · Editor: 程序博客网 · Date: 2024/06/05 12:30
```java
package com.billstudy.hdfs.test;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.junit.Before;
import org.junit.Test;

/**
 * Test HDFS operations (CRUD).
 * @author Bill
 * @since V1.0 2015-03-22 09:45:40
 */
public class HadoopJunitTest {

    private final String HDFS_BASE_PATH = "hdfs://h21:9000/";
    private FileSystem fs = null;
    private final Configuration configuration = new Configuration();

    @Before
    public void before() throws Exception {
        // Register a URL stream handler factory that supports the hdfs:// protocol
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
        // Create the file system from the base URI and configuration
        URI uri = new URI(HDFS_BASE_PATH);
        fs = FileSystem.get(uri, configuration);
    }

    @Test
    public void testMkdir() throws IOException, URISyntaxException {
        final Path path = new Path("/hey");
        // Create the directory only if it does not already exist
        boolean exists = fs.exists(path);
        System.out.println(path.getName() + " exists: " + exists);
        if (!exists) {
            fs.mkdirs(path);
        }
        fs.close();
    }

    @Test
    public void testPut() throws Exception {
        FSDataOutputStream outStream = fs.create(new Path("/hey/hi-hadoop.txt"), true);
        // input stream, output stream, configuration; the final "true"
        // closes both streams when the copy finishes
        IOUtils.copyBytes(new ByteArrayInputStream("hello hadoop ~".getBytes()),
                outStream, configuration, true);
        fs.close();
    }

    @Test
    public void testGet() throws Exception {
        FSDataInputStream inStream = fs.open(new Path("/hey/hi-hadoop.txt"));
        // Read the file and print it to the console
        IOUtils.copyBytes(inStream, System.out, configuration, true);
        fs.close();
    }

    @Test
    public void testListFile() throws Exception {
        FileStatus[] listStatus = fs.listStatus(new Path("/"));
        for (FileStatus f : listStatus) {
            System.out.println((f.isDir() ? "dir" : "file") + "\t"
                    + f.getAccessTime() + "\t"
                    + f.getBlockSize() + "\t"
                    + f.getGroup() + "\t"
                    + f.getLen() + "\t"
                    + f.getModificationTime() + "\t"
                    + f.getReplication() + "\t"
                    + f.getPermission() + "\t"
                    + f.getPath().getName());
        }
    }

    @Test
    public void testDelete() throws IOException {
        Path path = new Path("/hey");
        // Recursive delete, like the shell's "hdfs dfs -rm -r" (old "rmr")
        // fs.delete(path, true);
        fs.deleteOnExit(path);
    }
}
```
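Every test above goes through the same base path. As a quick, cluster-free illustration of what `FileSystem.get(uri, configuration)` receives, the sketch below parses that path with the standard `java.net.URI` class (the host `h21` is the NameNode alias used in the original post):

```java
import java.net.URI;

public class HdfsUriDemo {
    // Mirrors the HDFS_BASE_PATH constant from the test class above
    static final String HDFS_BASE_PATH = "hdfs://h21:9000/";

    public static void main(String[] args) {
        URI uri = URI.create(HDFS_BASE_PATH);
        // FileSystem.get picks the implementation by scheme (hdfs)
        // and connects to the NameNode at host:port
        System.out.println("scheme = " + uri.getScheme()); // hdfs
        System.out.println("host   = " + uri.getHost());   // h21
        System.out.println("port   = " + uri.getPort());   // 9000
    }
}
```

This is also why `before()` registers `FsUrlStreamHandlerFactory`: without it, plain `java.net.URL` would not recognize the `hdfs` scheme.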
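For comparison, the same operations can be run from the HDFS command-line shell. The commands below are a sketch using the paths from the test class; they assume a running cluster reachable from the client:

```shell
# Create the directory (testMkdir equivalent)
hdfs dfs -mkdir -p /hey
# Write stdin to an HDFS file (testPut equivalent; "-" reads from stdin)
echo "hello hadoop ~" | hdfs dfs -put - /hey/hi-hadoop.txt
# Print the file to the console (testGet equivalent)
hdfs dfs -cat /hey/hi-hadoop.txt
# List the root directory with permissions, replication and size (testListFile)
hdfs dfs -ls /
# Recursive delete (testDelete equivalent)
hdfs dfs -rm -r /hey
```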