Hadoop - copyFromLocal Command

Last Updated : 04 Aug, 2025

The Hadoop copyFromLocal command copies files from the local file system (Linux/Windows/Mac) into HDFS for storage or processing. It is commonly used to upload datasets, logs, or configuration files. The -f option forces an overwrite of an existing file in HDFS, much like deleting the old file before copying. Without -f, Hadoop throws an error if the file already exists.

Syntax

hdfs dfs -copyFromLocal [-f] <source_path1> <source_path2> ... <destination>

Parameters:
source_path: The file(s) in your local file system. Multiple files can be specified.
destination: Must be a directory in HDFS where the files will be copied.
-f: Optional switch to force an overwrite if the file already exists.

Note: The copyFromLocal command is identical to -put and can be used with either hdfs dfs or hadoop fs. The last argument always specifies the destination path in HDFS.

Steps to execute the copyFromLocal command

Let's first look at the current view of the root directory in HDFS.

Step 1: Make a directory in HDFS where you want to copy the file with the command below.

hdfs dfs -mkdir /Hadoop_File

Step 2: Use the copyFromLocal command as shown below to copy the file into the HDFS /Hadoop_File directory.

hdfs dfs -copyFromLocal /home/dikshant/Documents/hadoop_file/Salaries.csv /Hadoop_File

Step 3: Check whether the file was copied successfully by listing the destination directory with the command below.

hdfs dfs -ls /Hadoop_File

Overwriting or updating a file in HDFS with the -f switch

Running the same copyFromLocal command again does not replace a file of the same name at the same location; it reports that the file already exists. To update the content of the file, or to overwrite it, use the -f switch as shown below.

hdfs dfs -copyFromLocal -f /home/dikshant/Documents/hadoop_file/Salaries.csv /Hadoop_File

With the -f switch, copyFromLocal does not produce an error; it simply overwrites the existing file in HDFS.
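For reference, here is a minimal end-to-end sketch of the workflow above as a shell session. It assumes a running HDFS cluster with the hdfs client on your PATH and reuses the sample path from this article (/home/dikshant/Documents/hadoop_file/Salaries.csv); the directory and file names are only illustrative.

# Create the target directory in HDFS (-p avoids an error if it already exists)
hdfs dfs -mkdir -p /Hadoop_File

# Copy the local file into the HDFS directory
hdfs dfs -copyFromLocal /home/dikshant/Documents/hadoop_file/Salaries.csv /Hadoop_File

# Verify that the file landed in HDFS
hdfs dfs -ls /Hadoop_File

# Running the same copy again fails because the file already exists;
# add -f to overwrite the existing file in place
hdfs dfs -copyFromLocal -f /home/dikshant/Documents/hadoop_file/Salaries.csv /Hadoop_File

As the note above points out, the same commands also work with put in place of copyFromLocal.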