Common Hadoop Operations (hadoop fs)

1. hadoop fs -mkdir -p: create a directory (-p also creates any missing parent directories)

[hdfs@localhost~]$ hadoop fs -mkdir -p /aaaa/test
[hdfs@localhost~]$ hadoop fs -ls /
Found 26 items
drwxr-x---   - root users          0 2016-11-25 14:37 /DataIntegrity
drwxr-xr-x   - root users          0 2016-11-25 18:40 /Temp
drwxr-xr-x   - root users          0 2016-11-29 19:01 /Tmp
drwxr-x---   - hdfs users          0 2016-11-29 19:15 /aaaa
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 1 items
drwxr-x---   - hdfs users          0 2016-11-29 19:15 /aaaa/test

2. hadoop fs -rm -r: delete a directory recursively (moved to trash when trash is enabled)
[hdfs@localhost~]$ hadoop fs -rm -r /aaaa/test
16/11/29 19:28:44 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 20160 minutes, Emptier interval = 1440 minutes.
Moved: 'hdfs://defaultCluster/aaaa/test' to trash at: hdfs://defaultCluster/user/hdfs/.Trash/Current
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 2 items
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test1
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test2
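
As the transcript shows, the deleted directory goes to the trash (/user/hdfs/.Trash/Current) rather than being removed outright. A short sketch of the two related operations, assuming trash is enabled as in this cluster:

```shell
# Permanently delete, bypassing the trash (irreversible):
hadoop fs -rm -r -skipTrash /aaaa/test

# Recover an accidentally deleted path: it sits under the trash layout
# shown in the "Moved:" line above, so a plain move restores it:
hadoop fs -mv /user/hdfs/.Trash/Current/aaaa/test /aaaa/test
```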

3. hadoop fs -put: upload a local file to HDFS (/home/cs/test.txt is the local file; /aaaa/test is the target HDFS directory)
[hdfs@localhost~]$ hadoop fs -put /home/cs/test.txt  /aaaa/test
[hdfs@localhost~]$ hadoop fs -ls /aaaa/test
Found 1 items
-rw-r-----   3 hdfs users          0 2016-11-29 19:19 /aaaa/test/test.txt
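
Note that -put refuses to overwrite an existing destination; the -f flag forces the overwrite. A sketch using the same paths as above:

```shell
# Running the same upload again fails because the file already exists...
hadoop fs -put /home/cs/test.txt /aaaa/test
# ...unless -f is given to overwrite the destination:
hadoop fs -put -f /home/cs/test.txt /aaaa/test
```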

4. hadoop fs -get: download a file from HDFS to the local file system
[hdfs@localhost~]$ hadoop fs -get /aaaa/test/test.txt  /home/cs/qqqq
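
If you only need to inspect a file, you can read it in place instead of downloading it. Two common companions to -get:

```shell
# Print a file's contents directly, without copying it locally:
hadoop fs -cat /aaaa/test/test.txt
# Show the last 1 KB of a file (handy for large logs):
hadoop fs -tail /aaaa/test/test.txt
```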

5. hadoop fs -cp -f: copy a file from this cluster to another cluster's HDFS (hdfs://10.9.168.12:9000/bbb/ is a directory on the other cluster; -f overwrites the destination if it exists)
[hdfs@localhost~]$ hadoop fs -cp -f  /aaaa/test/test.txt hdfs://10.9.168.12:9000/bbb/
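
hadoop fs -cp streams the data through the single client machine, so for large inter-cluster copies the usual tool is distcp, which runs the copy as a distributed MapReduce job. A sketch, reusing the destination cluster address from the example above:

```shell
# Distributed copy between clusters (runs as a MapReduce job):
hadoop distcp hdfs://defaultCluster/aaaa/test hdfs://10.9.168.12:9000/bbb/
# -update copies only files that differ from the destination:
hadoop distcp -update hdfs://defaultCluster/aaaa/test hdfs://10.9.168.12:9000/bbb/test
```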

6. hadoop fs -du -h: show the size of files in HDFS, in human-readable units
[hdfs@localhost~]$ hadoop  fs -du -h /aaaa
0  0  /aaaa/test
0  0  /aaaa/test1
0  0  /aaaa/test2
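
By default -du prints one line per entry (the two columns are the raw size and the size including replication). Two related forms:

```shell
# -s aggregates the whole tree into a single summary line:
hadoop fs -du -s -h /aaaa
# -count reports directory count, file count, and total bytes:
hadoop fs -count /aaaa
```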

7. hadoop fs -chown -R: recursively change the owner and group of files in HDFS
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 3 items
drwxr-x---   - hdfs users          0 2016-11-29 19:19 /aaaa/test
drwxr-x---   - hdfs users          0 2016-11-29 19:25 /aaaa/test1
drwxr-x---   - hdfs users          0 2016-11-29 19:25 /aaaa/test2
[hdfs@localhost~]$ hadoop fs -chown -R root:root /aaaa
[hdfs@localhost~]$ hadoop fs -ls /aaaa
Found 3 items
drwxr-x---   - root root          0 2016-11-29 19:19 /aaaa/test
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test1
drwxr-x---   - root root          0 2016-11-29 19:25 /aaaa/test2
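
The permission commands follow the same pattern as chown; a quick sketch on the same directory:

```shell
# chmod sets the permission bits (here rwxr-x--- on the whole tree):
hadoop fs -chmod -R 750 /aaaa
# chgrp changes only the group, leaving the owner untouched:
hadoop fs -chgrp -R users /aaaa
```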

