Exp 2 Bda
directory.
Step 2: Check whether HDFS is running by using the ‘hdfs dfs -ls’ command.
Step 3: Create a file using the ‘nano’ command. Enter the text into the file, then press Ctrl+O to
save it and Ctrl+X to exit the editor.
Step 4: View the file content using the ‘cat’ command.
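Since nano is interactive, Steps 3 and 4 can also be done non-interactively, which is handy for scripting. A minimal sketch, assuming a hypothetical file name ‘sample.txt’:

```shell
# Create the file without opening an editor: a here-document writes the text.
cat > sample.txt <<'EOF'
Hello from the BDA lab
EOF

# Step 4: view the file content with 'cat'.
cat sample.txt   # prints: Hello from the BDA lab
```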
Step 5: Create a new directory in Hadoop using the command ‘hdfs dfs -mkdir directoryname’
Step 6: Use the `pwd` command which stands for "print working directory". It prints the path of the
current working directory.
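A quick check of ‘pwd’: create a directory, change into it, and print the working directory. The path ‘/tmp/bda_demo’ is a hypothetical example.

```shell
# Make a scratch directory and enter it.
mkdir -p /tmp/bda_demo
cd /tmp/bda_demo

# 'pwd' prints the absolute path of the current working directory.
pwd   # prints: /tmp/bda_demo
```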
Step 7: To copy the file from Linux to Hadoop, use the command ‘hdfs dfs -put filename.txt
/user/cloudera/directoryname’. When the ‘-put’ command is run, HDFS copies the text file from
your local file system to /user/cloudera/directoryname in HDFS.
Step 8: Use the ‘-cat’ command along with the HDFS path of the file to check the file content.
Step 9: Copy the file back to Linux. The ‘-get’ command fetches a file from HDFS to the local
file system. We use ‘hdfs dfs -get /user/cloudera/directoryname/filename.txt
/home/cloudera/directoryname’.
Step 10: Use ‘cd directoryname’ to change to the directory in Linux that we copied the file into.
cd stands for "change directory"; it changes the current working directory. We then use ‘ls’ to list
the files in the directory and check whether the file has been copied back or not.
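The steps above can be sketched as one script. This is a minimal sketch, assuming a Cloudera-quickstart-style setup; ‘dir2’, ‘sample.txt’, and ‘copied.txt’ are hypothetical names, and the HDFS part is guarded so the script still runs on a machine without an HDFS client.

```shell
# Step 3: create the file, and Step 4: view it.
echo "Hello from the BDA lab" > sample.txt
cat sample.txt

if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -mkdir -p /user/cloudera/dir2             # Step 5: directory in HDFS
    hdfs dfs -put sample.txt /user/cloudera/dir2       # Step 7: local -> HDFS
    hdfs dfs -cat /user/cloudera/dir2/sample.txt       # Step 8: view inside HDFS
    hdfs dfs -get /user/cloudera/dir2/sample.txt copied.txt   # Step 9: HDFS -> local
    ls copied.txt                                      # Step 10: confirm the copy
else
    echo "hdfs client not found; skipping the HDFS steps"
fi
```

The ‘command -v hdfs’ guard is only a convenience for running the sketch outside the VM; inside the lab environment the HDFS branch executes end to end.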