Basic HDFS Operations
Source: Internet · Publisher: 深圳税友软件 · Editor: 程序博客网 · Date: 2024/05/16 00:41
1. List an HDFS directory:
Run:
hadoop fs -ls hdfs://192.168.1.100:9000/
[hadoop@baolibin ~]$ hadoop fs -ls hdfs://192.168.1.100:9000/
Warning: $HADOOP_HOME is deprecated.
Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr
[hadoop@baolibin ~]$
2. Recursively list an HDFS directory:
Run:
hadoop fs -lsr hdfs://192.168.1.100:9000/
[hadoop@baolibin ~]$ hadoop fs -lsr hdfs://192.168.1.100:9000/
Warning: $HADOOP_HOME is deprecated.
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr/hadoop
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr/hadoop/tmp
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:24 /usr/hadoop/tmp/mapred
drwx------   - hadoop supergroup          0 2015-02-15 21:24 /usr/hadoop/tmp/mapred/system
-rw-------   1 hadoop supergroup          4 2015-02-15 21:24 /usr/hadoop/tmp/mapred/system/jobtracker.info
[hadoop@baolibin ~]$
3. Create a directory:
Run:
hadoop fs -mkdir input
[hadoop@baolibin ~]$ hadoop fs -mkdir input
Warning: $HADOOP_HOME is deprecated.
[hadoop@baolibin ~]$
Where the new directory ended up in HDFS:
[hadoop@baolibin ~]$ hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.
Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 22:09 /user/hadoop/input
[hadoop@baolibin ~]$
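Note what the listing shows: the relative path input resolved to /user/hadoop/input, because FsShell interprets relative paths against the current user's HDFS home directory, /user/<currentUser>. A minimal sketch of that resolution rule, as a plain Python illustration (resolve_hdfs_path is a hypothetical name, not a Hadoop API):

```python
def resolve_hdfs_path(path, current_user):
    """Resolve a path the way `hadoop fs` does: absolute paths are kept,
    relative paths are interpreted under /user/<currentUser>."""
    if path.startswith("/"):
        return path  # already absolute
    return "/user/%s/%s" % (current_user, path)

print(resolve_hdfs_path("input", "hadoop"))    # /user/hadoop/input
print(resolve_hdfs_path("/usr/hadoop/tmp", "hadoop"))  # /usr/hadoop/tmp
```

This is why `hadoop fs -mkdir input` and `hadoop fs -ls` in the transcripts above work without spelling out the full hdfs:// URI.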
4. Upload a file from Linux to HDFS:
Run:
hadoop fs -put /home/hadoop/baozi.txt /user/hadoop/input/
[hadoop@baolibin ~]$ hadoop fs -put /home/hadoop/baozi.txt /user/hadoop/input/
Warning: $HADOOP_HOME is deprecated.
[hadoop@baolibin ~]$
List the input directory:
[hadoop@baolibin ~]$ hadoop fs -ls input
Warning: $HADOOP_HOME is deprecated.
Found 1 items
-rw-r--r--   1 hadoop supergroup         46 2015-02-15 22:13 /user/hadoop/input/baozi.txt
[hadoop@baolibin ~]$
5. Download a file from HDFS to Linux:
Run:
hadoop fs -get /user/hadoop/input/baozi.txt /home/hadoop/xiaobaozi
[hadoop@baolibin ~]$ hadoop fs -get /user/hadoop/input/baozi.txt /home/hadoop/xiaobaozi
Warning: $HADOOP_HOME is deprecated.
[hadoop@baolibin ~]$
Check the result:
[hadoop@baolibin ~]$ ll /home/hadoop/xiaobaozi
total 4
-rw-rw-r--. 1 hadoop hadoop 46 Feb 15 22:17 baozi.txt
[hadoop@baolibin ~]$
6. View file contents:
Run:
hadoop fs -text input/baozi.txt
[hadoop@baolibin ~]$ hadoop fs -text input/baozi.txt
Warning: $HADOOP_HOME is deprecated.
hadoop hello java baozi hbase hive hadoop java
[hadoop@baolibin ~]$
7. Delete a file:
Run:
hadoop fs -rm input/baozi.txt
[hadoop@baolibin ~]$ hadoop fs -rm input/baozi.txt
Warning: $HADOOP_HOME is deprecated.
Deleted hdfs://192.168.1.100:9000/user/hadoop/input/baozi.txt
[hadoop@baolibin ~]$
Verify the deletion:
[hadoop@baolibin ~]$ hadoop fs -ls input/
Warning: $HADOOP_HOME is deprecated.
[hadoop@baolibin ~]$
8. Recursively delete a directory:
Run:
hadoop fs -rmr input
[hadoop@baolibin ~]$ hadoop fs -rmr input
Warning: $HADOOP_HOME is deprecated.
Deleted hdfs://192.168.1.100:9000/user/hadoop/input
[hadoop@baolibin ~]$
Verify the deletion:
[hadoop@baolibin ~]$ hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.
[hadoop@baolibin ~]$
9. List all commands:
Run:
hadoop fs
[hadoop@baolibin ~]$ hadoop fs
Warning: $HADOOP_HOME is deprecated.
Usage: java FsShell
           [-ls <path>]
           [-lsr <path>]
           [-du <path>]
           [-dus <path>]
           [-count[-q] <path>]
           [-mv <src> <dst>]
           [-cp <src> <dst>]
           [-rm [-skipTrash] <path>]
           [-rmr [-skipTrash] <path>]
           [-expunge]
           [-put <localsrc> ... <dst>]
           [-copyFromLocal <localsrc> ... <dst>]
           [-moveFromLocal <localsrc> ... <dst>]
           [-get [-ignoreCrc] [-crc] <src> <localdst>]
           [-getmerge <src> <localdst> [addnl]]
           [-cat <src>]
           [-text <src>]
           [-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
           [-moveToLocal [-crc] <src> <localdst>]
           [-mkdir <path>]
           [-setrep [-R] [-w] <rep> <path/file>]
           [-touchz <path>]
           [-test -[ezd] <path>]
           [-stat [format] <path>]
           [-tail [-f] <file>]
           [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
           [-chown [-R] [OWNER][:[GROUP]] PATH...]
           [-chgrp [-R] GROUP PATH...]
           [-help [cmd]]

Generic options supported are
-conf <configuration file>                    specify an application configuration file
-D <property=value>                           use value for given property
-fs <local|namenode:port>                     specify a namenode
-jt <local|jobtracker:port>                   specify a job tracker
-files <comma separated list of files>        specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>       specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives>  specify comma separated archives to be unarchived on the compute machines.

The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]
[hadoop@baolibin ~]$
10. Get help for a specific command:
Run:
hadoop fs -help ls
[hadoop@baolibin ~]$ hadoop fs -help ls
Warning: $HADOOP_HOME is deprecated.
-ls <path>:   List the contents that match the specified file pattern. If
              path is not specified, the contents of /user/<currentUser>
              will be listed. Directory entries are of the form
                    dirName (full path) <dir>
              and file entries are of the form
                    fileName (full path) <r n> size
              where n is the number of replicas specified for the file
              and size is the size of the file, in bytes.
[hadoop@baolibin ~]$
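As the help text explains, each file entry in an -ls listing carries a fixed set of whitespace-separated fields: permissions, replica count, owner, group, size, modification date and time, and path. When scripting against this Hadoop 1.x output, those fields can be pulled apart with a short parser; the function below is an illustrative sketch (parse_ls_entry is not part of Hadoop), using the baozi.txt entry from section 4 as sample input:

```python
def parse_ls_entry(line):
    """Split one `hadoop fs -ls` file entry into named fields.
    Assumes the Hadoop 1.x layout: perms, replicas, owner, group,
    size, date, time, path (path itself contains no spaces)."""
    perms, replicas, owner, group, size, date, time, path = line.split(None, 7)
    return {
        "perms": perms,
        # directories show "-" instead of a replica count
        "replicas": None if replicas == "-" else int(replicas),
        "owner": owner,
        "group": group,
        "size": int(size),
        "mtime": date + " " + time,
        "path": path,
    }

entry = parse_ls_entry(
    "-rw-r--r--   1 hadoop supergroup         46 2015-02-15 22:13 /user/hadoop/input/baozi.txt")
print(entry["replicas"], entry["size"], entry["path"])
```

The same function handles directory entries, where the replica column is a literal "-".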
11. View basic HDFS statistics:
Run:
/usr/hadoop/bin/hadoop dfsadmin -report
[hadoop@baolibin current]$ /usr/hadoop/bin/hadoop dfsadmin -report
Warning: $HADOOP_HOME is deprecated.
Configured Capacity: 7431069696 (6.92 GB)
Present Capacity: 2737635328 (2.55 GB)
DFS Remaining: 2737594368 (2.55 GB)
DFS Used: 40960 (40 KB)
DFS Used%: 0%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0

-------------------------------------------------
Datanodes available: 1 (1 total, 0 dead)

Name: 192.168.1.100:50010
Decommission Status : Normal
Configured Capacity: 7431069696 (6.92 GB)
DFS Used: 40960 (40 KB)
Non DFS Used: 4693434368 (4.37 GB)
DFS Remaining: 2737594368 (2.55 GB)
DFS Used%: 0%
DFS Remaining%: 36.84%
Last contact: Mon Feb 16 00:19:31 CST 2015

[hadoop@baolibin current]$
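The report is plain "key: value (human-readable)" text, so it is easy to scrape when monitoring capacity from a script. A minimal sketch (parse_report is a hypothetical helper; the sample string is the cluster-wide section of the report above):

```python
def parse_report(text):
    """Collect the numeric `key: value (...)` lines of a dfsadmin report
    into a dict, dropping the human-readable parenthesized suffixes."""
    stats = {}
    for line in text.splitlines():
        if ":" not in line:
            continue  # skip separators and blank lines
        key, _, value = line.partition(":")
        value = value.split("(")[0].strip()  # drop "(6.92 GB)" style suffixes
        if value.replace("%", "").replace(".", "").isdigit():
            stats[key.strip()] = value
    return stats

report = """Configured Capacity: 7431069696 (6.92 GB)
Present Capacity: 2737635328 (2.55 GB)
DFS Remaining: 2737594368 (2.55 GB)
DFS Used: 40960 (40 KB)
DFS Used%: 0%
Missing blocks: 0"""

stats = parse_report(report)
print(stats["Configured Capacity"])  # 7431069696
```

Non-numeric lines such as "Last contact: ..." are simply skipped by the digit check.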
12. Command reference:
hadoop fs
-help [cmd]                               // show help for a command
-ls(r) <path>                             // list all files in the directory (r: recursively)
-du(s) <path>                             // show the sizes of all files in the directory (s: summary)
-count [-q] <path>                        // count the files in the directory
-mv <src> <dst>                           // move files to the target directory
-cp <src> <dst>                           // copy files to the target directory
-rm(r) <path>                             // delete a file (r: a directory, recursively)
-put <localsrc> <dst>                     // copy a local file to HDFS
-copyFromLocal                            // same as -put
-moveFromLocal                            // move a local file to HDFS
-get [-ignoreCrc] <src> <localdst>        // copy a file to the local filesystem, optionally skipping the CRC check
-getmerge <src> <localdst>                // merge all files in the source directory, sorted, into one local file
-cat <src>                                // print file contents to the terminal
-text <src>                               // print file contents to the terminal
-copyToLocal [-ignoreCrc] <src> <localdst>   // copy to the local filesystem
-moveToLocal <src> <localdst>             // move to the local filesystem
-mkdir <path>                             // create a directory
-touchz <path>                            // create an empty file
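When automating these commands, it is safer to build the argument vector once than to concatenate shell strings. A minimal sketch (fs_argv and run_fs are hypothetical helper names; it assumes the hadoop binary is on PATH, so run_fs only works on a machine with a reachable cluster):

```python
import subprocess

def fs_argv(*args):
    """Build the argument vector for a `hadoop fs` subcommand."""
    return ["hadoop", "fs"] + list(args)

def run_fs(*args):
    """Run the subcommand and return its stdout; raises CalledProcessError
    on a non-zero exit. Requires a Hadoop installation, so it is not
    invoked in this sketch."""
    return subprocess.check_output(fs_argv(*args)).decode()

# e.g. the upload from section 4:
print(fs_argv("-put", "/home/hadoop/baozi.txt", "/user/hadoop/input/"))
```

Passing a list to subprocess avoids quoting problems with paths that contain spaces.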
I hope this helps beginners.