How to Start HDFS
To start HDFS, you first have to format the configured HDFS file system by executing the following command:
$ hadoop namenode -format
After formatting HDFS, start the NameNode and the DataNodes as a cluster with the following command:
$ $HADOOP_HOME/sbin/start-dfs.sh
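Once the daemons are up (in a standard Hadoop 2.x/3.x install the start script is $HADOOP_HOME/sbin/start-dfs.sh), a quick way to confirm they are actually running is jps, which lists the Java processes on the host:

```shell
# jps lists running JVM processes; on a healthy single-node HDFS setup
# you should see NameNode, DataNode and SecondaryNameNode among them
jps
```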
To know more about HDFS, check our previous post, HDFS OVERVIEW.
How to List Files in HDFS
To list the files in a directory, you can use the “ls” command:
$ $HADOOP_HOME/bin/hadoop fs -ls <args>
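For example, to list the contents of the HDFS root directory, or to walk a user directory recursively (the /user path here is just an illustration):

```shell
# List everything at the HDFS root
$HADOOP_HOME/bin/hadoop fs -ls /

# List a directory and all of its subdirectories recursively
$HADOOP_HOME/bin/hadoop fs -ls -R /user
```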
How to Insert Data Files in HDFS
First you have to create an input directory. Use the following command to create a directory:
$ $HADOOP_HOME/bin/hadoop fs -mkdir /user/input
Your file is still on the local system, so you need to transfer it from the local file system to the Hadoop file system. Use the following command:
$ $HADOOP_HOME/bin/hadoop fs -put /home/file.txt /user/input
The file is now inserted into HDFS. You can verify this using the “ls” command as follows:
$ $HADOOP_HOME/bin/hadoop fs -ls /user/input
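The insert steps above can be sketched as one small script (the local path and HDFS directory follow the examples in this post):

```shell
# Create the target directory in HDFS; -p avoids an error if it already exists
$HADOOP_HOME/bin/hadoop fs -mkdir -p /user/input

# Copy the local file into HDFS
$HADOOP_HOME/bin/hadoop fs -put /home/file.txt /user/input

# Verify that the file arrived
$HADOOP_HOME/bin/hadoop fs -ls /user/input
```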
How to Retrieve Data from HDFS
Example: you want to retrieve the data of a file named tipcircle.com.
First, use the “cat” command to view the file's data from the HDFS servers:
$ $HADOOP_HOME/bin/hadoop fs -cat /user/output/tipcircle.com
This step is the reverse of Step 2 of “How to Insert Data Files in HDFS” shown above. To copy a file from HDFS to your local file system, use the “get” command:
$ $HADOOP_HOME/bin/hadoop fs -get /user/output/ /home/hadoop_tp/
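The retrieval steps can likewise be combined into one sketch (paths follow the examples above; /user/output is assumed to already exist in HDFS):

```shell
# View the file directly from HDFS
$HADOOP_HOME/bin/hadoop fs -cat /user/output/tipcircle.com

# Copy the whole output directory down to the local file system
$HADOOP_HOME/bin/hadoop fs -get /user/output/ /home/hadoop_tp/

# Confirm the copy locally: -get of a directory recreates it under the target
ls /home/hadoop_tp/output
```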
Shutting Down HDFS
To shut down HDFS, you can use the following command:
$ $HADOOP_HOME/sbin/stop-dfs.sh