Hadoop Learning 5: The HDFS API
Source: Internet · 程序博客网 · 2024/05/20 15:11
Review of the previous lesson
1) The HDFS read path
DistributedFileSystem => FSDataInputStream => DFSClient.open (RPC) => NN.open
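For illustration, the read path above can be driven end to end with a short sketch. The cluster address and file path are assumptions borrowed from the test cases later in this post, not code from the original article:

```java
import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class ReadExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // for an hdfs:// URI this returns a DistributedFileSystem
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.201:9100"), conf);
        // open() goes through DFSClient, which asks the NN for block locations over RPC
        FSDataInputStream in = fs.open(new Path("/test/a.txt"));
        try {
            // copy file contents to stdout; false = do not close the streams here
            IOUtils.copyBytes(in, System.out, 1024, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
```

The returned FSDataInputStream is seekable, so the client can re-read from another replica if a datanode fails mid-stream.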
2) The HDFS write path
DistributedFileSystem => FSDataOutputStream => DFSClient.create (RPC) => NN.create
3) Role and mechanism of the SecondaryNamenode
The SNN is not a full backup of the NN.
It pulls the fsimage and edits files from the NN and merges them in the SNN's memory. Checkpointing is controlled by:
fs.checkpoint.period (interval between checkpoints)
fs.checkpoint.size (edits size that triggers a checkpoint)
fs.checkpoint.dir (where checkpoints are stored)
In hadoop 2.x (and from hadoop 0.23 on):
checkpoint node: behaves exactly like the SecondaryNamenode
backup node: a true, full backup of the namenode
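The three property names above come from the post; as a sketch of how they might be configured in a Hadoop 1.x core-site.xml, with illustrative values (the defaults are 3600 seconds and 64 MB):

```xml
<property>
  <name>fs.checkpoint.period</name>
  <value>3600</value> <!-- seconds between checkpoints -->
</property>
<property>
  <name>fs.checkpoint.size</name>
  <value>67108864</value> <!-- checkpoint early if edits reaches 64 MB -->
</property>
<property>
  <name>fs.checkpoint.dir</name>
  <value>/data/hadoop/namesecondary</value> <!-- path is an assumption -->
</property>
```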
4) If the NN or its metadata is lost, the metadata can be restored from the SNN's checkpoint directory:
hadoop namenode -importCheckpoint    (import the checkpoint data)
hadoop-daemon.sh start namenode      (start the namenode)
5) Rack awareness
By default all DNs are treated as being on the same rack (/default-rack), whether or not they physically share one.
The topology.script.file.name property points to a script (Python or shell) that encodes the real network topology, mapping each node to a rack path such as:
/d1/rack1/dn1
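The property name is from the post; placing it in core-site.xml and the script path below are my assumptions for a Hadoop 1.x setup. The named script receives datanode IPs/hostnames as arguments and must print one rack path (like /d1/rack1) per argument:

```xml
<property>
  <name>topology.script.file.name</name>
  <value>/etc/hadoop/topology.sh</value> <!-- illustrative path -->
</property>
```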
HDFS API
Docs: hadoop.apache.org/docs/current1/api
The test cases below demonstrate parts of the HDFS API:
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;
import org.junit.Test;

public class HdfsApiTest { // class wrapper added so the snippet compiles

    public static String hdfsUrl = "hdfs://192.168.1.201:9100";

    // create an HDFS folder
    @Test
    public void testHDFSMKdir() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        Path path = new Path("/test");
        fs.mkdirs(path);
    }

    // create an HDFS file
    @Test
    public void testCreateFile() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        Path path = new Path("/test/a.txt");
        FSDataOutputStream out = fs.create(path);
        out.write("hello hadoop!".getBytes());
        out.close(); // close to flush the data to HDFS
    }

    // rename a file
    @Test
    public void testRenameFile() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        Path path = new Path("/test/a.txt");
        Path newPath = new Path("/test/b.txt");
        System.out.println(fs.rename(path, newPath));
    }

    // upload a local file
    @Test
    public void testUploadLocalFile1() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        Path src = new Path("/home/hadoop-1.2.1/bin/rcc");
        Path dst = new Path("/test");
        fs.copyFromLocalFile(src, dst);
    }

    // another way to upload a local file: stream copy
    @Test
    public void testUploadLocalFile2() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        InputStream in = new BufferedInputStream(
                new FileInputStream(new File("/home/hadoop-1.2.1/bin/rcc")));
        FSDataOutputStream out = fs.create(new Path("/test/rcc"));
        IOUtils.copyBytes(in, out, 1024, true); // true: close both streams when done
    }

    // upload with a progress callback
    @Test
    public void testUploadLocalFile3() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        InputStream in = new BufferedInputStream(
                new FileInputStream(new File("/home/hadoop-1.2.1/bin/rcc")));
        FSDataOutputStream out = fs.create(new Path("/test/data"), new Progressable() {
            @Override
            public void progress() {
                System.out.println(".");
            }
        });
        IOUtils.copyBytes(in, out, 1024, true);
    }

    // list files under a folder
    @Test
    public void testListFiles() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        Path dst = new Path("/test");
        FileStatus[] files = fs.listStatus(dst);
        for (FileStatus file : files) {
            System.out.println(file.getPath().toString());
        }
    }

    // list the blocks of a file and the hosts holding each block
    @Test
    public void testGetBlockInfo() throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
        Path dst = new Path("/test/data");
        FileStatus fileStatus = fs.getFileStatus(dst);
        BlockLocation[] blkLoc = fs.getFileBlockLocations(fileStatus, 0, fileStatus.getLen());
        for (BlockLocation loc : blkLoc) {
            for (String host : loc.getHosts()) {
                System.out.println(host);
            }
        }
    }
}
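One common operation the test cases above do not cover is deletion. As a sketch (my addition, assuming the same cluster address used above):

```java
import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DeleteExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.201:9100"), conf);
        Path path = new Path("/test/b.txt");
        // exists() avoids an error message when the path is already gone
        if (fs.exists(path)) {
            // second argument: delete recursively (required for non-empty directories)
            System.out.println("deleted: " + fs.delete(path, true));
        }
    }
}
```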