
HADOOP COMMANDS

The following commands interact with HDFS: they list, create, copy, move, and delete files and directories, display file contents, and manage replication, ACLs, and safe mode. The examples use `/user/cloudera/` as the HDFS path and `/home/cloudera/` as the local file system path.

# List files/directories in /user/cloudera/
hadoop fs -ls /user/cloudera/

# Create a directory in /user/cloudera/
hadoop fs -mkdir /user/cloudera/new_directory
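
If intermediate directories may not exist yet, -p creates them and does not complain if the target already exists; the nested path below is just an assumed example.

# Create nested directories in one step
hadoop fs -mkdir -p /user/cloudera/projects/logs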

# Copy files from local file system (/home/cloudera/) to HDFS under /user/cloudera/
hadoop fs -copyFromLocal /home/cloudera/local_file.txt /user/cloudera/

# Copy files from HDFS under /user/cloudera/ to local file system (/home/cloudera/)
hadoop fs -copyToLocal /user/cloudera/hdfs_file.txt /home/cloudera/
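
For reference, -put and -get behave the same as -copyFromLocal and -copyToLocal for simple copies like these, and are the forms you will most often see; the paths are the same assumed example paths.

# Upload with -put (equivalent to -copyFromLocal here)
hadoop fs -put /home/cloudera/local_file.txt /user/cloudera/
# Download with -get (equivalent to -copyToLocal here)
hadoop fs -get /user/cloudera/hdfs_file.txt /home/cloudera/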

# Move files/directories from one location to another within /user/cloudera/
hadoop fs -mv /user/cloudera/source.txt /user/cloudera/destination.txt

# Delete a file in /user/cloudera/ (add -r to delete a directory and its contents)
hadoop fs -rm /user/cloudera/file_to_delete.txt
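
To delete a directory and everything in it, add -r; -skipTrash bypasses the trash if trash is enabled on the cluster. The directory name below is an assumed example.

# Remove a directory tree (moved to trash if trash is enabled)
hadoop fs -rm -r /user/cloudera/old_directory
# Remove it permanently, bypassing the trash
hadoop fs -rm -r -skipTrash /user/cloudera/old_directory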

# Display the contents of a file in /user/cloudera/
hadoop fs -cat /user/cloudera/file_to_display.txt

# Create a zero-length file in /user/cloudera/
hadoop fs -touchz /user/cloudera/empty_file.txt

# Append contents of a local file to a file in HDFS under /user/cloudera/
hadoop fs -appendToFile /home/cloudera/local_file.txt /user/cloudera/hdfs_file.txt
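
-touchz and -appendToFile combine naturally when building a file incrementally, assuming append is enabled on the cluster; the file name below is an assumed example.

# Create an empty file, append local content to it, then check the result
hadoop fs -touchz /user/cloudera/combined.txt
hadoop fs -appendToFile /home/cloudera/local_file.txt /user/cloudera/combined.txt
hadoop fs -cat /user/cloudera/combined.txt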

# Copy files from source to destination in HDFS under /user/cloudera/
hadoop fs -cp /user/cloudera/source.txt /user/cloudera/destination.txt

# Display the last kilobyte of a file in /user/cloudera/ to stdout
hadoop fs -tail /user/cloudera/file_to_display.txt

# Get the checksum of a file in /user/cloudera/
hadoop fs -checksum /user/cloudera/file_to_check.txt

# Show free space and capacity of the filesystem containing /user/cloudera/
hadoop fs -df /user/cloudera/

# Show disk usage of files and directories under /user/cloudera/
hadoop fs -du /user/cloudera/

# Count the number of directories, files, and bytes under the given path /user/cloudera/
hadoop fs -count /user/cloudera/
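
Both space-reporting commands accept -h for human-readable sizes, and -du -s prints a single summarized total for the whole path.

# Human-readable free space and usage
hadoop fs -df -h /user/cloudera/
hadoop fs -du -s -h /user/cloudera/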

# Show file/directory status in /user/cloudera/
hadoop fs -stat /user/cloudera/file_or_directory.txt
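
-stat also accepts a format string; common specifiers include %n (name), %b (size in bytes), %r (replication factor), and %y (modification time). The file name below is the same assumed example.

# Print name, size, replication, and modification time
hadoop fs -stat "%n %b %r %y" /user/cloudera/file_or_directory.txt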

# Set the replication level of files under /user/cloudera/
hadoop fs -setrep -R 3 /user/cloudera/

# Set Access Control Lists (ACLs) for files and directories under /user/cloudera/
hadoop fs -setfacl -R -m user:username:rwx /user/cloudera/

# Display the ACLs of /user/cloudera/ (add -R to show ACLs of everything under it)
hadoop fs -getfacl /user/cloudera/
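
A typical round trip is to grant an ACL entry with -setfacl -m and then confirm it with -getfacl. This assumes ACLs are enabled on the cluster (dfs.namenode.acls.enabled); username and the directory name are placeholders.

# Grant a user read/write/execute on a directory, then verify
hadoop fs -setfacl -m user:username:rwx /user/cloudera/shared_directory
hadoop fs -getfacl /user/cloudera/shared_directory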

# Get the current safe mode status
hadoop dfsadmin -safemode get

# Enter safe mode
hadoop dfsadmin -safemode enter

# Leave safe mode
hadoop dfsadmin -safemode leave
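
In scripts it is usually more convenient to block until the NameNode leaves safe mode than to poll the status; -safemode wait does exactly that.

# Wait until safe mode is off, then continue
hadoop dfsadmin -safemode wait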

Make sure to replace `/user/cloudera/` and `/home/cloudera/` with the appropriate paths on HDFS and your local
file system respectively.

Here is the same list using `hdfs dfs` in place of `hadoop fs`; for HDFS paths the two shells are interchangeable, so only the prefix changes:

# List files/directories in /user/cloudera/ → hdfs dfs -ls /user/cloudera/
# Create a directory in /user/cloudera/ → hdfs dfs -mkdir /user/cloudera/new_directory
# Copy files from local file system (/home/cloudera/) to HDFS under /user/cloudera/ → hdfs dfs -copyFromLocal /home/cloudera/local_file.txt /user/cloudera/
# Copy files from HDFS under /user/cloudera/ to local file system (/home/cloudera/) → hdfs dfs -copyToLocal /user/cloudera/hdfs_file.txt /home/cloudera/
# Move files/directories within /user/cloudera/ → hdfs dfs -mv /user/cloudera/source.txt /user/cloudera/destination.txt
# Delete a file in /user/cloudera/ (add -r for directories) → hdfs dfs -rm /user/cloudera/file_to_delete.txt
# Display the contents of a file in /user/cloudera/ → hdfs dfs -cat /user/cloudera/file_to_display.txt
# Create a zero-length file in /user/cloudera/ → hdfs dfs -touchz /user/cloudera/empty_file.txt
# Append contents of a local file to a file in HDFS under /user/cloudera/ → hdfs dfs -appendToFile /home/cloudera/local_file.txt /user/cloudera/hdfs_file.txt
# Copy files from source to destination in HDFS under /user/cloudera/ → hdfs dfs -cp /user/cloudera/source.txt /user/cloudera/destination.txt
# Display the last kilobyte of a file in /user/cloudera/ to stdout → hdfs dfs -tail /user/cloudera/file_to_display.txt
# Get the checksum of a file in /user/cloudera/ → hdfs dfs -checksum /user/cloudera/file_to_check.txt
# Show free space and capacity of the filesystem containing /user/cloudera/ → hdfs dfs -df /user/cloudera/
# Show disk usage of files and directories under /user/cloudera/ → hdfs dfs -du /user/cloudera/
# Count the number of directories, files, and bytes under /user/cloudera/ → hdfs dfs -count /user/cloudera/
# Show file/directory status in /user/cloudera/ → hdfs dfs -stat /user/cloudera/file_or_directory.txt
# Set the replication level of files under /user/cloudera/ → hdfs dfs -setrep -R 3 /user/cloudera/
# Set Access Control Lists (ACLs) for files and directories under /user/cloudera/ → hdfs dfs -setfacl -R -m user:username:rwx /user/cloudera/
# Display the ACLs of /user/cloudera/ → hdfs dfs -getfacl /user/cloudera/
# Get the current safe mode status → hdfs dfsadmin -safemode get
# Enter safe mode → hdfs dfsadmin -safemode enter
# Leave safe mode → hdfs dfsadmin -safemode leave
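
Putting a few of these together, a minimal end-to-end sketch (the demo directory and file names are assumed examples):

# Create a working directory, upload a file, inspect it, then clean up
hdfs dfs -mkdir -p /user/cloudera/demo
hdfs dfs -copyFromLocal /home/cloudera/local_file.txt /user/cloudera/demo/
hdfs dfs -ls /user/cloudera/demo/
hdfs dfs -cat /user/cloudera/demo/local_file.txt
hdfs dfs -rm -r /user/cloudera/demo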
