Hadoop Commands Guide

All Hadoop commands are invoked by the bin/hadoop script. Running the script without any arguments prints a description of all commands.


In this post we will discuss the most commonly used Hadoop commands. Let's start:


Overview of Hadoop Commands

All of the Hadoop commands and subprojects follow the same basic structure:

    shellcommand [SHELL_OPTIONS] [COMMAND] [GENERIC_OPTIONS] [COMMAND_OPTIONS]

  • shellcommand: The command of the project being invoked, for example hadoop, hdfs, or yarn.
  • SHELL_OPTIONS: Options that the shell processes before executing Java.
  • COMMAND: The action to perform.
  • GENERIC_OPTIONS: The common set of options supported by multiple commands.
  • COMMAND_OPTIONS: Options specific to the individual command.
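As a concrete illustration, here is how a typical invocation maps onto that structure (the configuration path is hypothetical):

    # shellcommand   SHELL_OPTIONS              COMMAND  COMMAND_OPTIONS
    hadoop           --config /opt/hadoop/conf  fs       -ls /user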

Shell Options

All of the shell commands accept a common set of options. The following shell options are commonly used within Hadoop commands.

  • --buildpaths: Enables developer versions of jars.
  • --config confdir: Overrides the default configuration directory, $HADOOP_HOME/etc/hadoop.
  • --debug: Enables shell-level configuration debugging information.
  • --help: Prints shell script usage information.
  • --hostnames: When --workers is used, overrides the workers file with a space-delimited list of hostnames on which to execute a multi-host subcommand.
  • --hosts: When --workers is used, overrides the workers file with another file that contains a list of hostnames on which to execute a multi-host subcommand.
  • --workers: If possible, execute this command on all hosts in the workers file.
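For example, a sketch of two of these options in action (the configuration path is illustrative, and the second command assumes passwordless ssh to the hosts in the workers file):

    # point the hadoop script at an alternate configuration directory
    hadoop --config /opt/hadoop/conf-staging version

    # start a datanode on every host listed in the workers file
    hdfs --workers --daemon start datanode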

The commands executed by the Hadoop shell fall into two groups:

  1. User Commands: Commands useful for users of a Hadoop cluster.
  2. Administration Commands: Commands useful for administrators of a Hadoop cluster.

User Hadoop Commands

Archives

Hadoop archives are special-format archives. A Hadoop archive maps to a file system directory and always has a .har extension.

Below is a link to our post on the Hadoop Archives Guide:

Hadoop Archives Guide – How to Create an Archive in Hadoop
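For quick reference, creating and inspecting an archive looks like this (the directories are illustrative):

    # pack dir1 and dir2 (relative to /user/hadoop) into foo.har under /user/zoo
    hadoop archive -archiveName foo.har -p /user/hadoop dir1 dir2 /user/zoo

    # list the archive's contents through the har:// filesystem
    hdfs dfs -ls har:///user/zoo/foo.har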

Checknative

The checknative command checks the availability of the native Hadoop libraries. It takes two options: hadoop checknative [-a] [-h].

  1. -a: Check that all libraries are available.
  2. -h: Print help.
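A typical invocation:

    # report the availability of each native library (zlib, snappy, and so on)
    hadoop checknative -a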

Below is a link to our separate post on the Native Hadoop Library:

Learn What is and How to Create Native Hadoop Library.

Classpath

The classpath command takes three options: hadoop classpath [--glob |--jar <path> |-h |--help].

  1. --glob: Expand wildcards.
  2. --jar path: Write the classpath as a manifest in a jar named path.
  3. -h, --help: Print help.

Prints the class path needed to get the Hadoop jar and the required libraries. If called without arguments, it prints the classpath set up by the command scripts, which is likely to contain wildcards in the classpath entries. The additional options print the classpath after wildcard expansion or write the classpath into the manifest of a jar file. The latter is useful in environments where wildcards cannot be used and the expanded classpath exceeds the maximum supported command-line length.
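For example (the jar path is illustrative):

    # print the classpath with wildcards expanded
    hadoop classpath --glob

    # write the expanded classpath into the manifest of a jar
    hadoop classpath --jar /tmp/hadoop-classpath.jar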

Credential

  1. create alias [-provider provider-path] [-strict] [-value credential-value]: Prompts the user for a credential to be stored as the given alias. The hadoop.security.credential.provider.path within the core-site.xml file will be used unless a -provider is indicated. The -strict flag will cause the command to fail if the provider uses a default password. Use the -value flag to supply the credential value instead of being prompted.
  2. delete alias [-provider provider-path] [-strict] [-f]: Deletes the credential with the given alias. The hadoop.security.credential.provider.path within the core-site.xml file will be used unless a -provider is indicated. The -strict flag will cause the command to fail if the provider uses a default password. The command asks for confirmation unless -f is specified.
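For example, storing and removing a credential in a Java keystore provider (the provider path is illustrative):

    # prompt for a value and store it under the given alias
    hadoop credential create ssl.server.keystore.password -provider jceks://file/tmp/test.jceks

    # delete it without prompting for confirmation
    hadoop credential delete ssl.server.keystore.password -provider jceks://file/tmp/test.jceks -f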

Distch

The distch command changes the ownership and permissions of many files at once. It takes the following options:

  1. -f: List of objects to change.
  2. -i: Ignore failures.
  3. -log: Directory to log output.
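A minimal sketch, assuming the path:owner:group:permissions target format used by distch (all values here are illustrative; an empty field leaves that attribute unchanged):

    # set owner, group and permissions on /user/alice/data, logging to /tmp/distch-log
    hadoop distch -log /tmp/distch-log /user/alice/data:alice:hadoop:755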

Distcp

We have covered DistCp in a separate post.

What is DistCp (Distributed copy) in Hadoop.

Dtutil

The dtutil command is a utility to fetch and manage Hadoop delegation tokens inside credentials files. Its subcommands are:

  1. print: Print out the fields in the tokens contained in filename.
  2. get URL: Fetch a token from the service at URL and place it in filename.
  3. append: Append the contents of the first N filenames onto the last filename.
  4. remove -alias alias: From each file specified, remove the tokens matching alias and write out each file using the specified format.
  5. cancel -alias alias: Just like remove, except the tokens are also cancelled using the service specified in the token object.
  6. renew -alias alias: For each file specified, renew the tokens matching alias and write out each file using the specified format.
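A sketch of a typical round trip (the service URL and filename are illustrative):

    # fetch a delegation token from the namenode and save it to a file
    hadoop dtutil get hdfs://nn.example.com:8020 tokens.bin

    # inspect the fields of the saved token
    hadoop dtutil print tokens.bin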

jar

Runs a jar file.
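For example, launching the classic wordcount job from $HADOOP_HOME (the jar name varies by release, and the input/output paths are illustrative):

    hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount /input /output

Note that the Hadoop documentation recommends yarn jar for launching YARN applications.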

jnipath

Print the computed java.library.path.

kerbname

Convert the named principal via the auth_to_local rules to the Hadoop user name.
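For example, assuming an auth_to_local rule that maps principals in the realm to their short names:

    # prints "user"
    hadoop kerbname user@EXAMPLE.COM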

Administration Hadoop Commands

daemonlog

  1. -getlevel host:port classname [-protocol (http|https)]: Prints the log level of the log identified by a qualified classname, in the daemon running at host:port. The -protocol flag specifies the protocol for the connection.
  2. -setlevel host:port classname level [-protocol (http|https)]: Sets the log level of the log identified by a qualified classname, in the daemon running at host:port. The -protocol flag specifies the protocol for the connection.

This command works by sending an HTTP/HTTPS request to the daemon's internal Jetty servlet. The following daemons support this command:

  • HDFS
  1. name node
  2. secondary name node
  3. data node
  4. journal node
  • YARN
  1. resource manager
  2. node manager
  3. timeline server
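For example, checking and raising the NameNode's log level (the host is illustrative; 9870 is the default NameNode HTTP port in Hadoop 3, and the new level is reset when the daemon restarts):

    # print the current level of the NameNode logger
    hadoop daemonlog -getlevel nn.example.com:9870 org.apache.hadoop.hdfs.server.namenode.NameNode

    # raise it to DEBUG
    hadoop daemonlog -setlevel nn.example.com:9870 org.apache.hadoop.hdfs.server.namenode.NameNode DEBUG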

Files

  1. etc/hadoop/hadoop-env.sh: This file stores the global settings used by all Hadoop shell commands.
  2. etc/hadoop/hadoop-user-functions.sh: This file allows advanced users to override some shell functionality.
  3. ~/.hadooprc: This stores the personal environment for an individual user. It is processed after the hadoop-env.sh and hadoop-user-functions.sh files and can contain the same settings.
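For instance, a minimal sketch of a personal ~/.hadooprc (both values are illustrative; the variable names follow the hadoop-env.sh conventions):

    # use a specific JDK for all hadoop shell commands
    export JAVA_HOME=/usr/lib/jvm/java-11-openjdk

    # give client-side JVMs (e.g. hadoop fs) a larger heap
    export HADOOP_CLIENT_OPTS="-Xmx2g"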