Hadoop Installation Final
hadoop-2.6.5.tar.gz
1. Copy hadoop-2.6.5.tar.gz to your home folder (the login is the same as your India domain login).
3. Generate an SSH key and append the public key to .ssh/authorized_keys. Review the file
to make sure no extra keys you weren't expecting have been added.
4. Verify passwordless SSH: ssh localhost should log in without prompting for a password.
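The key-generation and passwordless-SSH steps above can be sketched as follows (a minimal sketch using the OpenSSH default key type and paths; adjust for your site):

```shell
# Create the .ssh directory with the permissions sshd requires
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
# Generate an RSA key only if one does not already exist (empty passphrase)
[ -f "$HOME/.ssh/id_rsa" ] || ssh-keygen -t rsa -N "" -f "$HOME/.ssh/id_rsa"
# Authorize the key for logins to this node
cat "$HOME/.ssh/id_rsa.pub" >> "$HOME/.ssh/authorized_keys"
chmod 600 "$HOME/.ssh/authorized_keys"
# Verify manually: "ssh localhost hostname" should not prompt for a password
```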
e.g., checking which Java installation the node uses:
[aauser@dc2100-r1-node1 ~]$ which java
/usr/bin/java
[aauser@dc2100-r1-node1 ~]$ ll /usr/bin/java
lrwxrwxrwx 1 root root 22 Aug 3 18:31 /usr/bin/java -> /etc/alternatives/java
[aauser@dc2100-r1-node1 ~]$ ll /etc/alternatives/java
lrwxrwxrwx 1 root root 39 Aug 3 18:31 /etc/alternatives/java -> /usr/java/jdk1.7.0_67-cloudera/bin/java
5. Set the PATH and CLASSPATH variables appropriately in the .bashrc file in your home folder.
# HADOOP_HOME must be defined first, since the other variables reference it
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
export HADOOP_HOME=/home/<empid>/hadoop-2.6.5
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_LIBEXEC_DIR=$HADOOP_HOME/libexec
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$JAVA_HOME/bin
Add the following properties to etc/hadoop/hdfs-site.xml:
<property>
<name>dfs.name.dir</name>
<value>/home/<empid>/hadoop-2.6.5/dfs/nn</value>
</property>
<property>
<name>dfs.data.dir</name>
<value>/home/<empid>/hadoop-2.6.5/dfs/dn</value>
</property>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
NB: Make sure the directories configured as dfs.name.dir and dfs.data.dir have been created.
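Both directories can be created in one go; a sketch, assuming the hadoop-2.6.5 install lives in your home folder as configured above:

```shell
# Create the NameNode (nn) and DataNode (dn) directories referenced by
# dfs.name.dir and dfs.data.dir in hdfs-site.xml
BASE="$HOME/hadoop-2.6.5/dfs"
mkdir -p "$BASE/nn" "$BASE/dn"
# Confirm both exist
ls -d "$BASE/nn" "$BASE/dn"
```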
In etc/hadoop/yarn-site.xml:
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
In etc/hadoop/mapred-site.xml:
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
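This guide configures hdfs-site.xml, yarn-site.xml, and mapred-site.xml; a pseudo-distributed install usually also needs fs.defaultFS set in etc/hadoop/core-site.xml. A hedged sketch (localhost and port 9000 are common defaults, not taken from this guide):

```xml
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
```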
Note: If there is an error in this step, check the following commands. If they return
errors, contact the IS team.
i. hostname -i
[xxxxxx@01hw744052]$ hostname -i
It should print the host's IP address (e.g. 127.0.1.1) if everything is OK.
ii. hostname
[xxxxxx@01hw744052]$ hostname
e.g. 01hw664400.ln.india.tcs.com
To start all the daemons, run start-all.sh (present in the sbin folder of the installation directory).
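The start-up step can be sketched as a small helper, assuming $HADOOP_HOME is exported as in the .bashrc above. Formatting the NameNode once before the very first start is standard for a fresh install (this guide does not state it explicitly):

```shell
# Hedged sketch of first-time start-up; paths assume the hadoop-2.6.5 layout
start_hadoop() {
  # Format the NameNode only once: reformatting destroys existing HDFS data.
  # dfs/nn is the dfs.name.dir configured in hdfs-site.xml above.
  if [ ! -d "$HADOOP_HOME/dfs/nn/current" ]; then
    "$HADOOP_HOME/bin/hdfs" namenode -format
  fi
  "$HADOOP_HOME/sbin/start-all.sh"   # starts the HDFS and YARN daemons
}
# Usage: start_hadoop, then run jps to confirm NameNode, DataNode,
# ResourceManager, and NodeManager are running.
```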
hadoop fs -ls /
19. To copy files from HDFS to the local filesystem, use the command below. (Optional)
hadoop fs -copyToLocal /output1/part-r-00000 .