HDFS Commands
In this lesson on Apache Hadoop HDFS commands, we will go through the most common commands used for Hadoop administration and for managing files on a Hadoop cluster.
HDFS Commands
HDFS commands can be run on any Hadoop cluster, or you are free to use any of the VMs offered by Hortonworks, Cloudera, etc.
In this guide, we will make use of an Ubuntu 17.10 (GNU/Linux 4.13.0-37-generic x86_64) machine:
Ubuntu Version
Finally, we will make use of Hadoop v3.0.1 for this lesson:
Hadoop version
Let’s get started.
Hadoop HDFS Commands
We will start with some very basic help commands and go into more detail as we go through this lesson.
Getting all HDFS Commands
The simplest help command for Hadoop HDFS is the following, which lists all the available commands in Hadoop and how to use them:
hadoop fs -help

Let’s see the output for this command:
Hadoop fs help
The output was quite long, as this prints all the available commands along with a brief description of how to use each of them.
Help on a specific Hadoop command
The information printed by the last command was quite large, as it covered every command. Finding help for a specific command is tricky in that output. Here is a command to narrow your search:
hadoop fs -help ls

Let’s see the output of this command:
Hadoop specific command guide
Usage of a specific Hadoop command
To learn the syntax of each command, we don’t need to go anywhere apart from the terminal itself. To see the syntax of a command and how to use it, use the usage option:
hadoop fs -usage ls

Let’s see the output of this command:
Usage of Hadoop Command
Apart from usage, it also shows all possible options for the command specified.
Listing HDFS files and directories
To list all the available files and subdirectories under the default directory, just use the following command:
hadoop fs -ls

Let’s see the output for this command:
Listing all files
We ran this in the root directory, which explains the output shown.
Making an HDFS Directory
We can make a new directory in the Hadoop File System using the following command:
hadoop fs -mkdir /root/journaldev_bigdata

Note that if you create a new directory inside the /user/ directory, Hadoop will have read/write permissions on the directory; for other directories, it only has read permission by default.
Copying a file from the Local File System to HDFS
To copy a file from the local file system to HDFS, we can use a simple command:
hadoop fs -copyFromLocal derby.log /root/journaldev_bigdata

Let’s see the output for this command:
Copy File from local fs to HDFS
If instead of copying the file you just want to move it, make use of the -moveFromLocal option.
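The difference between the two options mirrors the familiar copy-versus-move distinction on a local file system. A small Python sketch using temporary local files (an analogy only, not real HDFS calls): -copyFromLocal leaves the source in place, while -moveFromLocal removes it after the transfer.

```python
import os
import shutil
import tempfile

# Local-filesystem analogy of -copyFromLocal vs -moveFromLocal.
src_dir = tempfile.mkdtemp()
dst_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "derby.log")
with open(src, "w") as f:
    f.write("log line\n")

# Copy: the source file survives.
shutil.copy(src, os.path.join(dst_dir, "copied.log"))
print(os.path.exists(src))   # True

# Move: the source file is gone afterwards.
shutil.move(src, os.path.join(dst_dir, "moved.log"))
print(os.path.exists(src))   # False
```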
磁盤使用情況 (Disk Usage)
We can see the disk usage of files under HDFS in a given directory with a simple option as shown:
hadoop fs -du /root/journaldev_bigdata/

Let’s see the output for this command:
Disk Usage of a directory in HDFS
If you simply want to check disk usage of the complete HDFS, run the following command:

Let’s see the output for this command:
Disk Usage of complete HDFS
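The -du output is easy to post-process. A minimal Python sketch, assuming the Hadoop 3 column layout of logical size, disk space consumed (size times replication), and path; the sample output and its numbers are illustrative, not copied from the screenshots:

```python
# Sum the logical file sizes reported by `hadoop fs -du`.
def total_du(output):
    """Add up the first column (logical size in bytes) of each line."""
    total = 0
    for line in output.strip().splitlines():
        total += int(line.split()[0])
    return total

# Illustrative sample: <size> <disk space consumed> <path>
sample_output = """\
20631  20631  /root/journaldev_bigdata/derby.log
1024   3072   /root/journaldev_bigdata/metrics.log"""
print(total_du(sample_output))  # 21655
```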
Emptying Trash Data
When we are sure that no files in the trash are usable, we can empty the trash in HDFS by deleting all files with the following command:
hadoop fs -expunge

This will simply delete all trashed data in HDFS and produces no output.
Modifying the replication factor for a file
As we already know, the replication factor is the number of copies of a file maintained across the Hadoop cluster in its HDFS. We can modify the replication factor of a file using the following command:
hadoop fs -setrep -w 1 /root/journaldev_bigdata/derby.log

Let’s see the output of this command:
Modify replication factor in HDFS
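The storage impact of changing the replication factor is simple arithmetic: the physical footprint is the logical size multiplied by the replication factor. A quick sketch (the file size below is a hypothetical value, and 3 is HDFS's usual default replication):

```python
# Physical storage consumed = logical size x replication factor.
def physical_bytes(logical_size, replication):
    """Bytes actually occupied on the cluster for one file."""
    return logical_size * replication

file_size = 20631                      # hypothetical derby.log size in bytes
print(physical_bytes(file_size, 3))    # 61893 with the default replication of 3
print(physical_bytes(file_size, 1))    # 20631 after `-setrep -w 1`
```

Lowering replication to 1 (as above) cuts the physical footprint to a third, at the cost of losing fault tolerance for that file.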
Updating Hadoop Directory permissions
If you face permission-related issues in Hadoop, run the following command:
hadoop fs -chmod 700 /root/journaldev_bigdata/

With this command, you can set the permissions granted on an HDFS directory and restrict its access.
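The octal value works the same way as in POSIX chmod: each digit encodes read (4), write (2), and execute (1) bits for owner, group, and others. A small decoder sketch showing that 700 grants full access to the owner only:

```python
# Decode an octal chmod value (e.g. "700") into an rwx permission string.
def octal_to_rwx(octal_str):
    """Map each digit to its read/write/execute flags."""
    flags = []
    for digit in octal_str:
        n = int(digit)
        flags.append(("r" if n & 4 else "-")
                     + ("w" if n & 2 else "-")
                     + ("x" if n & 1 else "-"))
    return "".join(flags)

print(octal_to_rwx("700"))  # rwx------  (owner only, as used above)
print(octal_to_rwx("755"))  # rwxr-xr-x  (world-readable)
```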
Removing an HDFS Directory
We can remove an entire HDFS directory using the rm command:
hadoop fs -rm -r /root/journaldev_bigdata

Let’s see the output for this command:
Removing directory from HDFS
That’s all for a quick roundup of Hadoop HDFS commands.
翻譯自: https://www.journaldev.com/20624/hdfs-commands