Exp-2 Hadoop Commands

Hadoop commands:

Most HDFS shell commands are almost the same as their Linux counterparts.

Before running these HDFS commands, download the sample datasets from the links below and save them in the work/datasets folder:

https://www.briandunning.com/sample-data/us-500.zip  //extract this file and place it in the work/datasets folder

https://raw.github.com/vincentarelbundock/Rdatasets/master/csv/datasets/Titanic.csv

http://archive.ics.uci.edu/ml/machine-learning-databases/00222/bank.zip  //extract bank-full.csv from inside the zip

Resources:

https://hadoop.apache.org/docs/2.4.1/hadoop-project-dist/hadoop-common/FileSystemShell.html

1) hdfs dfs -mkdir <path>


Creates a directory named path in HDFS.

With the -p option, it also creates any parent directories in path that are missing, like mkdir -p in Linux (see the example after the folder list below).

hdfs dfs -mkdir /home


Similarly, create more folders:

hdfs dfs -mkdir /datasets

hdfs dfs -mkdir /titanicdata

hdfs dfs -mkdir /otherdataset
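To create nested directories in one step, use the -p option (the /work/datasets path here is just an illustration):

hdfs dfs -mkdir -p /work/datasets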



2) hdfs dfs -put <localSrc> <dest>

Copies the file or directory from the local file system identified by localSrc
to dest within the DFS.

hdfs dfs -put file:///home/hadoop/sh.txt /


hdfs dfs -put file:///home/hadoop/work/datasets/Titanic.csv /titanicdata
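The other downloaded datasets can be loaded the same way; for instance, assuming bank.zip was extracted to work/datasets/bank-full.csv:

hdfs dfs -put file:///home/hadoop/work/datasets/bank-full.csv /datasets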
//Now the data is stored in HDFS. Next, verify whether it was stored correctly.

3) hdfs dfs -ls <path>

Lists the contents of the directory specified by path, showing the names,
permissions, owner, size and modification date for each entry.

hdfs dfs -ls /


hdfs dfs -ls /titanicdata
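//A typical -ls entry looks roughly like this (illustrative values; size and date will differ):
//-rw-r--r--   1 hadoop supergroup       1234 2024-01-01 10:30 /titanicdata/Titanic.csv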
//To list the contents of a folder recursively, use -lsr or -ls -R



4) hdfs dfs -lsr <path>

Behaves like -ls, but recursively displays entries in all subdirectories of path.

hdfs dfs -ls -R /
hdfs dfs -lsr /
//Now that the data is stored in HDFS, use -cat to read it.

5) hdfs dfs -cat <file-name>


Displays the contents of filename on stdout.
hdfs dfs -cat /sh.txt
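For large files, it helps to view only the first few lines by piping the output to the Linux head command:

hdfs dfs -cat /titanicdata/Titanic.csv | head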

6) hdfs dfs -mv <src> <dest>


Moves the file or directory indicated by src to dest, within HDFS.
hdfs dfs -mv /Crime.csv /home/



-mv is also useful for renaming a file or directory within HDFS.
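For instance, to rename the Titanic file in place (the new name below is just an illustration):

hdfs dfs -mv /titanicdata/Titanic.csv /titanicdata/titanic_passengers.csv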

7) hdfs dfs -cp <src> <dest>


Copies the file or directory identified by src to dest, within HDFS.
hdfs dfs -cp /data.txt /home/data/

8) hdfs dfs -rm <path>


Removes the file or empty directory identified by path.
hdfs dfs -rm /data.txt



9) hdfs dfs -rmr <path>
Removes the file or directory identified by path. Recursively deletes any
child entries (i.e., files or subdirectories of path).
hdfs dfs -rmr /home/
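Note that -rmr is deprecated in newer Hadoop releases; the equivalent modern form is -rm -r:

hdfs dfs -rm -r /home/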

10) hdfs dfs -copyFromLocal <localSrc> <dest>


Identical to -put.
hdfs dfs -copyFromLocal file:///home/hadoop/sh.txt /home/

11) hdfs dfs -moveFromLocal <localSrc> <dest>


Copies the file or directory from the local file system identified by localSrc
to dest within HDFS, and then deletes the local copy on success.

hdfs dfs -moveFromLocal /home/hadoop/process-bank-data /

The local copy is now deleted.



12) hdfs dfs -get [-crc] <src> <localDest>
Copies the file or directory in HDFS identified by src to the local file system
path identified by localDest.

hdfs dfs -get /process-bank-data /home/hadoop
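To confirm the copy, list it on the local file system with the ordinary Linux ls command:

ls /home/hadoop/process-bank-data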

