Common Hadoop operations (hadoop fs)


1. hadoop fs -mkdir -p: create a directory (with parents, as needed)

[hdfs@localhost~]$ hadoop fs -mkdir -p /aaaa/test
[hdfs@localhost~]$ hadoop fs -ls /
Found 26 items
drwxr-x---   - root users          0 2016-11-25 14:37 /DataIntegrity
drwxr-xr-x   - root users          0 2016-11-25 18:40 /Temp
drwxr-xr-x   - root users          0 2016-11-29 19:01 /Tmp
drwxr-x---   - hdfs users          0 2016-11-29 19:15 /aaaa
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 1 items
drwxr-x---   - hdfs users          0 2016-11-29 19:15 /aaaa/test

2. hadoop fs -rm -r: delete a directory (moved to the trash by default)
[hdfs@localhost~]$ hadoop fs -rm -r /aaaa/test
16/11/29 19:28:44 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 20160 minutes, Emptier interval = 1440 minutes.
Moved: 'hdfs://defaultCluster/aaaa/test' to trash at: hdfs://defaultCluster/user/hdfs/.Trash/Current
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 2 items
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test1
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test2
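As the transcript shows, -rm -r moves the directory into the user's .Trash rather than deleting it outright, and the trash is only emptied on the configured interval. A sketch of the related trash commands, reusing this article's example paths (the restore path assumes the trash layout shown in the log above):

```shell
# Delete permanently, bypassing the trash (irreversible)
hadoop fs -rm -r -skipTrash /aaaa/test

# A trashed directory can be restored by moving it back out of .Trash
hadoop fs -mv /user/hdfs/.Trash/Current/aaaa/test /aaaa/test

# Force-empty the current user's trash ahead of the deletion interval
hadoop fs -expunge
```

Use -skipTrash when you need the space back immediately; otherwise the default trash behavior is a useful safety net.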

3. hadoop fs -put: upload a local file to HDFS (/home/cs/test.txt is the local file, /aaaa/test is the target HDFS directory)
[hdfs@localhost~]$ hadoop fs -put /home/cs/test.txt /aaaa/test
[hdfs@localhost~]$ hadoop fs -ls /aaaa/test
Found 1 items
-rw-r-----   3 hdfs users          0 2016-11-29 19:19 /aaaa/test/test.txt
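By default -put refuses to overwrite a file that already exists in HDFS. A short sketch of two common variants (a.txt and b.txt are hypothetical local files for illustration):

```shell
# -f overwrites the destination if it already exists
hadoop fs -put -f /home/cs/test.txt /aaaa/test

# Several local files can be uploaded into one HDFS directory in a single call
hadoop fs -put /home/cs/a.txt /home/cs/b.txt /aaaa/test
```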

4. hadoop fs -get: download a file from HDFS to the local filesystem
[hdfs@localhost~]$ hadoop fs -get /aaaa/test/test.txt  /home/cs/qqqq
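When a job's output is split across many part-* files in one HDFS directory, -getmerge concatenates them into a single local file instead of fetching them one by one (merged.txt is a hypothetical local destination):

```shell
# Merge every file under the HDFS directory into one local file
hadoop fs -getmerge /aaaa/test /home/cs/merged.txt
```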

5. hadoop fs -cp -f: copy a file from this cluster to another cluster's HDFS (hdfs://10.9.168.12:9000/bbb/ is the other cluster's HDFS path)
[hdfs@localhost~]$ hadoop fs -cp -f /aaaa/test/test.txt hdfs://10.9.168.12:9000/bbb/
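-cp streams the data through the single client machine, so for large inter-cluster transfers the distributed copy tool distcp is usually preferred: it runs the copy as a MapReduce job across the cluster. A sketch, reusing the cluster addresses from this article:

```shell
# Distributed copy between clusters, executed as a MapReduce job
hadoop distcp hdfs://defaultCluster/aaaa/test hdfs://10.9.168.12:9000/bbb/

# -update copies only files that differ from the destination
hadoop distcp -update hdfs://defaultCluster/aaaa/test hdfs://10.9.168.12:9000/bbb/
```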

6. hadoop fs -du -h: show the size of files on HDFS (human-readable)
[hdfs@localhost~]$ hadoop fs -du -h /aaaa
0  0  /aaaa/test
0  0  /aaaa/test1
0  0  /aaaa/test2
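Two related capacity commands that often go with -du:

```shell
# -s prints one summarized total for the path instead of a per-child listing
hadoop fs -du -s -h /aaaa

# -df reports the capacity, used, and available space of the whole filesystem
hadoop fs -df -h /
```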

7. hadoop fs -chown -R: change the owner and group of files on HDFS (recursively)
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 3 items
drwxr-x---   - hdfs users          0 2016-11-29 19:19 /aaaa/test
drwxr-x---   - hdfs users          0 2016-11-29 19:25 /aaaa/test1
drwxr-x---   - hdfs users          0 2016-11-29 19:25 /aaaa/test2
[hdfs@localhost~]$ hadoop fs -chown -R root:root /aaaa
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 3 items
drwxr-x---   - root root          0 2016-11-29 19:19 /aaaa/test
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test1
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test2
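The companion command for permission bits (rather than ownership) is chmod; note that changing ownership as above normally requires HDFS superuser privileges:

```shell
# Recursively set permissions to rwxr-x--- on the example directory
hadoop fs -chmod -R 750 /aaaa
```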

