How To Handle 100 Million Rows With SQL Server BCP
In this article, we will explore the Bulk Copy Program, also known as the SQL Server BCP tool. BCP allows us to export data from SQL Server into flat files, and it also enables us to transfer data between different SQL Server instances or into SQL Azure.
Introduction
Exporting and importing data are among the most common operations that database professionals perform. SQL Server offers various tools to accomplish them: SQL Server Integration Services (SSIS), SQL Server Management Studio (SSMS), the OPENROWSET function, and BCP are the first options that come to mind. SQL Server BCP is a very simple command-line tool that exports table data from SQL Server into flat files, or imports flat-file data into database tables. However, when we use BCP for very large data transfer operations, we may run into performance problems. For this reason, in the next sections of this article we will focus on how to improve the performance of data transfer operations for tables that contain a huge number of rows.
Prerequisites
For the examples in this article, we will create a table and populate it with 100 million rows. The following script creates the SalesPerson table, and we can use ApexSQL Generate to generate the 100 million rows of test data.
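A minimal sketch of what such a table might look like; the column names and types below are illustrative assumptions, not a fixed schema:

CREATE TABLE dbo.SalesPerson
(
    SalesPersonID INT IDENTITY(1, 1) PRIMARY KEY, -- assumed surrogate key
    FirstName VARCHAR(50),
    LastName VARCHAR(50),
    SalesAmount DECIMAL(18, 2) -- illustrative measure column
);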
On the other hand, we will use the AdventureWorks sample database for the first-look examples.
If the installed BCP version is older than the latest release, we can download and install the latest version from the Microsoft website. Using SQL Server BCP is not very complex because it runs with only a handful of arguments. The syntax of BCP is as follows:
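In its most common form, the bcp command line follows this pattern (a condensed sketch of the utility's syntax, showing only the arguments used in this article):

bcp {table | view | "query"} {in | out | queryout | format} data_file
    [-S server] [-d database] [-T | -U login -P password] [-c | -n] [-a packet_size] [-h "hints"]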
For example, if we want to export data from a table to a text file, we have to specify the table name, the out option, and the data file. The following command will export the Production table into the specified text file.
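A sketch of such an export command, assuming the AdventureWorks sample database and illustrative server and path names (the exact table and file in the original example may differ):

bcp AdventureWorks2017.Production.Product out C:\ExportFolder\Product.txt -S localhost -T -c

Here, out specifies the export direction, -S the server name, -T a trusted (Windows authentication) connection, and -c character format.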
After the export operation, the text file will look like the one below:
At the same time, we can export the result sets of queries through the queryout parameter, so that we can filter the data or join tables before the export operation.
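For instance, a sketch of a queryout command (the query, server, and output path here are illustrative):

bcp "SELECT FirstName, LastName FROM AdventureWorks2017.Person.Person WHERE BusinessEntityID <= 100" queryout C:\ExportFolder\PersonNames.txt -S localhost -T -c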
BCP can also import data into SQL Azure; after the import completes, the imported data can be seen in the query editor of SQL Azure.
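A sketch of such an import into SQL Azure with SQL authentication (the server, database, credentials, and file path below are placeholders):

bcp dbo.SalesPerson in C:\ExportFolder\SalesPerson.txt -S yourserver.database.windows.net -d TestDB -U azureuser -P yourpassword -c

Note that the target database is given explicitly with the -d option.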
In our first test, we will run SQL Server BCP with its default settings in order to export 100 million rows.
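A sketch of the export command used for this test (the database, server, and path names are illustrative assumptions):

bcp TestDB.dbo.SalesPerson out C:\ExportFolder\SalesPerson.txt -S localhost -T -c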
As we can see in the above image, the export operation completed in about 588.079 seconds, which works out to 170,045 rows exported per second. The network packet size setting specifies how many bytes SQL Server sends in each network packet. The -a parameter changes the packet size for an individual bcp session and might help to increase the performance of data transfer operations. The default packet size is 4096 bytes; we will increase this number to 32,728 bytes, which affects the export performance positively.
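A sketch of the same export with the larger packet size (names remain illustrative):

bcp TestDB.dbo.SalesPerson out C:\ExportFolder\SalesPerson.txt -S localhost -T -c -a 32728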
After changing the packet size parameter of BCP, the data transfer duration decreased and the number of rows transferred per second increased, so increasing the packet size can improve data transfer. Finally, using fast disk systems for the exported file location, tuning the packet size parameter of BCP, and using a fast NIC card will all improve export performance.
Before starting the first import test with BCP, we will take a look at the log file size of the database, and then we will start the data import operation for 100 million rows.
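The log file size can be checked with a built-in command, and the import can be run with a bcp command like the following (database, server, and path names are illustrative):

DBCC SQLPERF(LOGSPACE);

bcp TestDB.dbo.SalesPerson in C:\ExportFolder\SalesPerson.txt -S localhost -T -c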
After the completion of the data import operation, the log file size has reached 27,144 megabytes.
On the other hand, using the simple or bulk-logged recovery model may enable the minimal logging mode. For SQL Server to enable minimal logging, the target table must not be involved in replication. In addition to this condition, we need to use the TABLOCK hint in the BCP command, and the table must also be empty if it includes a clustered index.
After changing the database recovery model to bulk-logged, we drop and re-create the target table, and then re-execute the following bcp command with the TABLOCK hint.
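A sketch of the recovery model change and the import command with the TABLOCK hint (names and paths remain illustrative):

ALTER DATABASE TestDB SET RECOVERY BULK_LOGGED;

bcp TestDB.dbo.SalesPerson in C:\ExportFolder\SalesPerson.txt -S localhost -T -c -h "TABLOCK"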
After the completion of the data import operation, the log file size reached only 200 megabytes. In this test, we have seen that SQL Server minimizes log file activity, which increases the performance of the bulk copy operation.
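BCP can also transfer data in SQL Server's native (binary) data format through the -n option, which skips character conversion entirely. A sketch of a native-format export and re-import with the TABLOCK hint (names and paths are illustrative):

bcp TestDB.dbo.SalesPerson out C:\ExportFolder\SalesPerson.dat -S localhost -T -n

bcp TestDB.dbo.SalesPerson in C:\ExportFolder\SalesPerson.dat -S localhost -T -n -h "TABLOCK"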
After using the native data format in BCP, the data import performance improved noticeably because unnecessary data type conversions are eliminated.
The following chart shows how the minimal logging mode and the native data format option affect the data import performance.
Conclusion
In this article, we have explored the SQL Server BCP tool and focused on how to improve its performance with some configuration changes. Minimal logging mode and the native data format dramatically increase the performance of BCP. At the same time, the packet size parameter can also affect BCP's data transfer performance.
See more
To generate millions of rows of test data quickly, consider ApexSQL Generate, a test data generator specifically designed for SQL Server developers.
Esat Erkec
Esat Erkec is a SQL Server professional who began his career 8+ years ago as
a Software Developer. He is a SQL Server Microsoft Certified Solutions Expert.
Most of his career has been focused on SQL Server Database Administration and
Development. His current interests are in database administration and Business
Intelligence. You can find him on LinkedIn.
Related Posts:
1. How to import data from an Excel file to a SQL Server database
2. How to import/export data to SQL Server using the SQL Server Import and Export Wizard
3. How to export data from SQL Server to a Flat file
4. An introduction to the bcp Utility (bulk copy program) in SQL Server
5. The BCP (Bulk Copy Program) command in action
Thank you for putting up this article regarding the BCP tool, its usage, and the ways to use this tool for exporting and importing tables with 100 million rows, with screenshot demonstrations. Very insightful information, much needed during this time period.

I noticed a couple of things that may need to be updated: you mentioned the packet size is 32,728 bytes, but the BCP screen showed 32756 bytes. Second, the log file size after the data import is mentioned as 27,144 MB, but the screenshot shows 27143.99 MB. Last, before the Conclusion section you mention a chart showing how the minimal logging mode and the native data format option affect the data import performance, but the chart is not displayed.