Practical No. 2: Hadoop Installation on Windows 10
AIM: Install Hadoop
After downloading Java version 1.8, download Hadoop version 3.1.0 from this link:
https://archive.apache.org/dist/hadoop/common/hadoop-3.1.0/hadoop-3...
Extract it to a folder.
Create a new user variable with variable name HADOOP_HOME and variable value as the path of the extracted Hadoop folder. Likewise, create a new user variable with variable name JAVA_HOME and variable value as the path of the JDK installation directory.
Now we need to add the Hadoop bin directory and the Java bin directory to the Path variable under system variables.
Edit Path in system variables: click on New and add the bin directory paths of Hadoop and Java.
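The same environment-variable setup can also be done from the Command Prompt with `setx`. The paths below are examples only; substitute the actual locations where you extracted Hadoop and installed the JDK:

```shell
:: Example paths -- replace with your actual JDK and Hadoop locations
setx JAVA_HOME "C:\Progra~1\Java\jdk1.8.0_251"
setx HADOOP_HOME "C:\Users\hp\Downloads\hadoop-3.1.0\hadoop-3.1.0"

:: Append both bin folders to the user Path (takes effect in new cmd windows)
setx PATH "%PATH%;%JAVA_HOME%\bin;%HADOOP_HOME%\bin"
```

Note that `setx` changes apply only to newly opened cmd windows, not the current one.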
Configurations
Now we need to edit some configuration files located in the etc\hadoop folder of the directory where we extracted Hadoop. The files that need to be edited are listed below.
1. Edit the file core-site.xml in the etc\hadoop directory. Copy this property into the configuration element of the file:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
2. Edit mapred-site.xml and copy this property into the configuration:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
3. Create a folder named 'data' in the Hadoop directory.
Inside this data directory, create a folder named 'datanode' and a folder named 'namenode'.
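Assuming Hadoop was extracted to the example path used throughout this guide, the folders can also be created from cmd (adjust the path to your system):

```shell
:: Example extraction path from this guide -- adjust to your system
mkdir C:\Users\hp\Downloads\hadoop-3.1.0\hadoop-3.1.0\data\namenode
mkdir C:\Users\hp\Downloads\hadoop-3.1.0\hadoop-3.1.0\data\datanode
```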
4. Edit the file hdfs-site.xml and add the below property in the configuration.
Note: The paths in the value elements should be the paths of the namenode and datanode folders you just created.
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>C:\Users\hp\Downloads\hadoop-3.1.0\hadoop-3.1.0\data\namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>C:\Users\hp\Downloads\hadoop-3.1.0\hadoop-3.1.0\data\datanode</value>
  </property>
</configuration>
5. Edit the file yarn-site.xml and add the below properties in the configuration:

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
</configuration>
6. Edit hadoop-env.cmd and replace %JAVA_HOME% with the path of the Java folder where your JDK 1.8 is installed.
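The edited line in hadoop-env.cmd would look something like the following; the JDK path is an example and must match your installation:

```shell
:: In etc\hadoop\hadoop-env.cmd -- example JDK path, adjust to yours
set JAVA_HOME=C:\Progra~1\Java\jdk1.8.0_251
```

If your JDK lives under a path containing spaces (such as Program Files), using the 8.3 short name Progra~1 avoids errors in the Hadoop cmd scripts.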
Hadoop needs Windows-specific native files which do not come with the default download of Hadoop.
To include those files, replace the bin folder in the Hadoop directory with the bin folder provided in this GitHub link:
https://github.com/s911415/apache-hadoop-3.1.0-winutils
Download it as a zip file, extract it, and copy the bin folder from it. If you want to keep the old bin folder, rename it to something like bin_old, then paste the copied bin folder into the Hadoop directory.
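The rename-and-replace step can be sketched in cmd as follows; both paths are examples, and the name of the extracted winutils folder depends on where and how you unzipped it:

```shell
:: Example paths -- adjust to your Hadoop directory and winutils extraction folder
cd C:\Users\hp\Downloads\hadoop-3.1.0\hadoop-3.1.0

:: Keep the original bin folder as a backup
ren bin bin_old

:: Copy the winutils bin folder (with all subfolders) into place
xcopy /E /I C:\Users\hp\Downloads\apache-hadoop-3.1.0-winutils-master\bin bin
```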
Check whether Hadoop is successfully installed by running this command in cmd:

hadoop version
If the command doesn't throw an error and shows the Hadoop version, Hadoop is successfully installed on the system.
Format the NameNode
Formatting the NameNode is done once, when Hadoop is first installed, and not every time the Hadoop filesystem is run; formatting it again will delete all the data inside HDFS. Run this command:

hdfs namenode -format
The NameNode format output will appear in the console.
Now change the directory in cmd to the sbin folder of the Hadoop directory with this command (Note: make sure you write the path as per your system):

cd C:\Users\hp\Downloads\hadoop-3.1.0\hadoop-3.1.0\sbin
Start the NameNode and DataNode with this command:

start-dfs.cmd

Two more cmd windows will open, one for the NameNode and one for the DataNode.
Now start YARN with this command:

start-yarn.cmd

Two more windows will open, one for the YARN resource manager and one for the YARN node manager.
Note: Make sure all four Apache Hadoop Distribution windows are up and running. If they are not running, you will see an error or a shutdown message; in that case, you need to debug the error.
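One quick way to confirm all four daemons are alive is the jps tool that ships with the JDK (not mentioned in the original steps, but available once JAVA_HOME\bin is on the Path):

```shell
:: Lists running JVM processes; a healthy single-node setup should show
:: NameNode, DataNode, ResourceManager and NodeManager (plus Jps itself)
jps
```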
To access information about the resource manager's current, successful, and failed jobs, go to this link in a browser:
http://localhost:8088/cluster