Salesforce Data Loader
@salesforcedocs
Last updated: January 28, 2016
Copyright 2000–2016 salesforce.com, inc. All rights reserved. Salesforce is a registered trademark of salesforce.com, inc.,
as are other names and marks. Other marks appearing herein may be trademarks of their respective owners.
CONTENTS
Chapter 1: Data Loader 1
Chapter 2: When to Use Data Loader 2
Considerations for Installing Data Loader 3
Configure Data Loader 4
Data Loader Behavior with Bulk API Enabled 7
Configure the Data Loader to Use the Bulk API 7
CHAPTER 1
Data Loader
Data Loader is a client application for the bulk import or export of data. Use it to insert, update,
delete, or export Salesforce records.
Available in: Enterprise, Performance, Unlimited, Developer, and Database.com Editions
When importing data, Data Loader reads, extracts, and loads data from comma-separated values
(CSV) files or from a database connection. When exporting data, it outputs CSV files.
Note: If commas are not appropriate for your locale, use a tab or other delimiter.
You can use Data Loader in two different ways:
User interface: When you use the user interface, you work interactively to specify the configuration parameters, CSV files used for import and export, and the field mappings that map the field names in your import file with the field names in Salesforce.
Command line (Windows only): When you use the command line, you specify the configuration, data sources, mappings, and actions in files. This enables you to set up Data Loader for automated processing.
Data Loader offers the following key features:
An easy-to-use wizard interface for interactive use
An alternate command-line interface for automated batch operations (Windows only)
Support for large files with up to 5 million records
Drag-and-drop field mapping
Support for all objects, including custom objects
Can be used to process data in both Salesforce and Database.com
Detailed success and error log files in CSV format
A built-in CSV file viewer
Support for Windows XP, Windows 7, and Mac OS X
To get started, see the following topics:
When to Use Data Loader
Installing Data Loader
Considerations for Installing Data Loader
Configure Data Loader
Note: In previous versions, Data Loader has been known as AppExchange Data Loader and Sforce Data Loader.
CHAPTER 2
When to Use Data Loader
Use Data Loader when:
You need to load into an object that is not yet supported by the import wizards.
You want to schedule regular data loads, such as nightly imports.
You want to export your data for backup purposes.
Considerations for Installing Data Loader
Installation Considerations
Over time, several versions of the Data Loader client application have been available for download. Some earlier versions were called
AppExchange Data Loader or Sforce Data Loader. You can run different versions at the same time on one computer. However, do
not install more than one copy of the same version.
The latest version is always available in Salesforce. If you have installed the latest version and want to install it again, first remove the
version on your computer.
Tip: If you experience login issues in the command line interface after upgrading to a new version of Data Loader, please try
re-encrypting your password to solve the problem. For information on the password encryption utility, see Encrypt from the
Command Line on page 19.
Note: The Data Loader command-line interface is supported for Windows only.
To make changes to the source code, download the open-source version of Data Loader from https://github.com/forcedotcom/dataloader.
Login Considerations
The latest version of Data Loader supports Web Server OAuth Authentication for both Windows and Mac, which provides an extra layer
of security compliance. See OAuth Authentication for more information.
If your organization restricts IP addresses, logins from untrusted IPs are blocked until they're activated. Salesforce automatically sends
you an activation email that you can use to log in. The email contains a security token that you must add to the end of your password.
For example, if your password is mypassword, and your security token is XXXXXXXXXX, you must enter mypasswordXXXXXXXXXX
to log in.
Configure Data Loader
Use the Settings > Settings page to change the default operation settings of Data Loader. Settings include Batch size, Assignment rule, Server host, Compression, Timeout, Proxy host, Proxy port, Proxy username, and Proxy password, as well as the options described below.
Use Bulk API
Select this option to use the Bulk API to insert, update, upsert, delete, and hard delete records. Keep in mind that hard deleted records are immediately deleted and can't be recovered from the Recycle Bin.
Enable serial mode for Bulk API
Select this option to use serial instead of parallel processing for Bulk API. Processing in parallel can cause database contention. When this is severe, the load may fail. Using serial mode guarantees that batches are processed one at a time. Note that using this option may significantly increase the processing time for a load.
Time Zone
If a date value does not include a time zone, this value is used. If no value is specified, the time zone of the computer where Data Loader is installed is used. If an incorrect value is entered, GMT is used as the time zone and this fact is noted in the Data Loader log. Valid values are any time zone identifier which can be passed to the Java getTimeZone(java.lang.String) method. The value can be a full name such as America/Los_Angeles, or a custom ID such as GMT-8:00.
Reset URL on Login
By default, Salesforce resets the URL after login to the one specified in Server host. To turn off this automatic reset, disable this option.
Start at row
If your last operation failed, you can use this setting to begin where the last successful operation finished.
Data Loader Behavior with Bulk API Enabled
The following settings are not available on the Settings > Settings page in Data Loader when the Use Bulk API option is selected:
Insert null values
This option enables Data Loader to insert blank mapped values as null values during data operations when the Bulk API is disabled. Empty field values are ignored when you update records using the Bulk API. To set a field value to null when the Use Bulk API option is selected, use a field value of #N/A.
Allow field truncation
This option directs Data Loader to truncate data for certain field types when the Bulk API is disabled. A load operation fails for the row if a value is specified that is too large for the field when the Use Bulk API option is selected.
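For example, with the Use Bulk API option selected, you clear a field during an update by supplying #N/A as its value. A minimal sketch of such an update file (the record ID and the choice of the Phone field are hypothetical):
Id,Phone
001D000000AbcDEIA0,#N/A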
Data Types Supported by Data Loader
Data Loader supports the following data types:
Base64
String path to file (converts the file to a base64-encoded array). Base64 fields are only used to
insert or update attachments and Salesforce CRM Content. For more information, see Uploading
Attachments on page 14 and Upload Content with the Data Loader on page 15.
Boolean
True values (case insensitive) = yes, y, true, on, 1
False values (case insensitive) = no, n, false, off, 0
Date Formats
We recommend you specify dates in the format yyyy-MM-ddTHH:mm:ss.SSS+/-HHmm:
yyyy is the four-digit year
MM is the two-digit month (01-12)
dd is the two-digit day (01-31)
HH is the two-digit hour (00-23)
mm is the two-digit minute (00-59)
ss is the two-digit seconds (00-59)
SSS is the three-digit milliseconds (000-999)
+/-HHmm is the Zulu (UTC) time zone offset
The following date formats are also supported:
yyyy-MM-dd'T'HH:mm:ss.SSS'Z'
yyyy-MM-dd'T'HH:mm:ss.SSS Pacific Standard Time
yyyy-MM-dd'T'HH:mm:ss.SSSPacific Standard Time
yyyy-MM-dd'T'HH:mm:ss.SSS PST
yyyy-MM-dd'T'HH:mm:ss.SSSPST
yyyy-MM-dd'T'HH:mm:ss.SSS GMT-08:00
yyyy-MM-dd'T'HH:mm:ss.SSSGMT-08:00
yyyy-MM-dd'T'HH:mm:ss.SSS -800
yyyy-MM-dd'T'HH:mm:ss.SSS-800
yyyy-MM-dd'T'HH:mm:ss
yyyy-MM-dd HH:mm:ss
yyyyMMdd'T'HH:mm:ss
yyyy-MM-dd
MM/dd/yyyy HH:mm:ss
MM/dd/yyyy
yyyyMMdd
Note the following tips for date formats:
To enable date formats that begin with the day rather than the month, select the Use European date format box in
the Settings dialog. European date formats are dd/MM/yyyy and dd/MM/yyyy HH:mm:ss.
If your computer's locale is east of Greenwich Mean Time (GMT), we recommend that you change your computer setting to
GMT in order to avoid date adjustments when inserting or updating records.
Only dates within a certain range are valid. The earliest valid date is 1700-01-01T00:00:00Z GMT, or just after midnight on January
1, 1700. The latest valid date is 4000-12-31T00:00:00Z GMT, or just after midnight on December 31, 4000. These values are offset
by your time zone. For example, in the Pacific time zone, the earliest valid date is 1699-12-31T16:00:00, or 4:00 PM on December
31, 1699.
Double
Standard double string
ID
A Salesforce ID is a case-sensitive 15-character or case-insensitive 18-character alphanumeric string that uniquely identifies a particular
record.
Tip: To ensure data quality, make sure that all Salesforce IDs you enter in Data Loader are in the correct case.
Integer
Standard integer string
String
All valid XML strings; invalid XML characters are removed.
Export Data
You can use the Data Loader export wizard to extract data from any Salesforce object. When you
export, you can choose to include (Export All) or exclude (Export) soft-deleted records.
1. Open the Data Loader.
2. Click Export or Export All. These commands can also be found in the File menu.
3. Enter your Salesforce username and password. Click Log in to log in. After your login completes
successfully, click Next. (Until you log out or close the program, you will not be asked to log in
again.)
If your organization restricts IP addresses, logins from untrusted IPs are blocked until they're
activated. Salesforce automatically sends you an activation email that you can use to log in. The
email contains a security token that you must add to the end of your password. For example,
if your password is mypassword, and your security token is XXXXXXXXXX, you must enter
mypasswordXXXXXXXXXX to log in.
4. Choose an object. For example, select the Account object. If your object name does not display
in the default list, check Show all objects to see a complete list of objects that you can
access. The objects will be listed by localized label name, with developer name noted in
parentheses. For object descriptions, see the SOAP API Developer's Guide.
5. Click Browse... to select the CSV file to which the data will be exported. You can enter a new
file name to create a new file or choose an existing file.
(User permissions: to export records or export all records, you need "Read" permission on the records.)
If you select an existing file, the contents of that file are replaced. Click Yes to confirm this action, or click No to choose another file.
6. Click Next.
7. Create a SOQL query for the data export. For example, check Id and Name in the query fields and click Finish. As you follow the
next steps, you will see that the CSV viewer displays all the Account names and their IDs. SOQL is the Salesforce Object Query
Language that allows you to construct simple but powerful query strings. Similar to the SELECT command in SQL, SOQL allows you
to specify the source object, a list of fields to retrieve, and conditions for selecting rows in the source object.
a. Choose the fields you want to export.
b. Optionally, select conditions to filter your data set. If you do not select any conditions, all the data to which you have read access
will be returned.
c. Review the generated query and edit if necessary.
Tip: You can use a SOQL relationship query to include fields from a related object. For example:
Select Name, Pricebook2Id, Pricebook2.Name, Product2Id, Product2.ProductCode FROM
PricebookEntry WHERE IsActive = true
Or:
Select Id, LastName, Account.Name FROM Contact
When using relationship queries in Data Loader, the fully specified field names are case-sensitive. For example, using
ACCOUNT.NAME instead of Account.Name does not work.
Data Loader doesn't support nested queries or querying child objects. For example, queries similar to the following return an
error:
SELECT Amount, Id, Name, (SELECT Quantity, ListPrice,
PriceBookEntry.UnitPrice, PricebookEntry.Name,
PricebookEntry.product2.Family FROM OpportunityLineItems)
FROM Opportunity
Also, Data Loader doesn't support queries that make use of polymorphic relationships. For example, the following query results
in an error:
SELECT Id, Owner.Name, Owner.Type, Owner.Id, Subject FROM Case
For more information on SOQL, see the Force.com SOQL and SOSL Reference.
8. Click Finish, then click Yes to confirm.
9. A progress information window reports the status of the operation.
10. After the operation completes, a confirmation window summarizes your results. Click View Extraction to view the CSV file, or click
OK to close. For more details, see Reviewing Data Loader Output Files on page 16.
Note:
Data Loader currently does not support the extraction of attachments. As a workaround, we recommend that you use the
weekly export feature in the online application to export attachments.
If you select compound fields for export in the Data Loader, they cause error messages. To export values, use individual field
components.
Define Data Loader Field Mappings
3. Optionally, click Save Mapping to save this mapping for future use. Specify a name for the
SDL mapping file.
If you select an existing file, the contents of that file are replaced. Click Yes to confirm this action, or click No to choose another file.
4. Click OK to use your mapping for the current operation.
Insert, Update, or Delete Data Using Data Loader
The insert, update, upsert, delete, and hard delete wizards in Data Loader allow you to add new
records, modify existing records, or delete existing records. Note that upsert is a combination of
inserting and updating. If a record in your file matches an existing record, the existing record is updated with the values in your file. If
no match is found, then the record is created as new. When you hard delete records, the deleted records are not stored in the Recycle
Bin and become immediately eligible for deletion. For more information, see Configure Data Loader on page 4.
1. Open the Data Loader.
2. Click Insert, Update, Upsert, Delete or Hard Delete. These commands can also be found in the File menu.
3. Enter your Salesforce username and password. Click Log in to log in. After your login completes successfully, click Next. (Until you
log out or close the program, you are not asked to log in again.)
If your organization restricts IP addresses, logins from untrusted IPs are blocked until they're activated. Salesforce automatically sends
you an activation email that you can use to log in. The email contains a security token that you must add to the end of your password.
For example, if your password is mypassword, and your security token is XXXXXXXXXX, you must enter
mypasswordXXXXXXXXXX to log in.
4. Choose an object. For example, if you are inserting Account records, select Account. If your object name does not display in the
default list, check Show all objects to see a complete list of the objects that you can access. The objects are listed by localized
label name, with developer name noted in parentheses. For object descriptions, see the Object Reference for Salesforce and Force.com.
5. Click Browse... to select your CSV file. For example, if you are inserting Account records, you could specify a CSV file named
insertaccounts.csv containing a Name column for the names of the new accounts.
6. Click Next. After the object and CSV file are initialized, click OK.
7. If you are performing an upsert:
a. Your CSV file must contain a column of ID values for matching against existing records. The column may be either an external
ID (a custom field with the External ID attribute), or Id (the Salesforce record ID). From the drop-down list, select which field
to use for matching. If the object has no external ID fields, Id is automatically used. Click Next to continue.
b. If your file includes the external IDs of an object that has a relationship to your chosen object, enable that external ID for record
matching by selecting its name from the drop-down list. If you make no selection here, you can use the related object's Id
field for matching by mapping it in the next step. Click Next to continue.
8. Define how the columns in your CSV file map to Salesforce fields. Click Choose an Existing Map to select an existing field mapping,
or click Create or Edit a Map to create a new map or modify an existing map. For more details and an example of usage, see Define
Data Loader Field Mappings on page 12.
9. Click Next.
10. For every operation, the Data Loader generates two unique CSV log files; one file name starts with success, while the other starts
with error. Click Browse... to specify a directory for these files.
11. Click Finish to perform the operation, and then click Yes to confirm.
12. As the operation proceeds, a progress information window reports the status of the data movement.
13. After the operation completes, a confirmation window summarizes your results. Click View Successes to view your success file,
click View Errors to open your errors file, or click OK to close. For more information, see Reviewing Data Loader Output Files on
page 16.
Tip:
If you are updating or deleting large amounts of data, review Perform Mass Updates and Performing Mass Deletes for tips and
best practices.
There is a five-minute limit to process 100 records when the Bulk API is enabled. Also, if it takes longer than 10 minutes to
process a file, the Bulk API places the remainder of the file back in the queue for later processing. If the Bulk API continues to
exceed the 10-minute limit on subsequent attempts, the file is placed back in the queue and reprocessed up to 10 times before
the operation is permanently marked as failed. Even if the processing failed, some records could have completed successfully,
so you must check the results. If you get a timeout error when loading a file, split your file into smaller files, and try again.
Perform Mass Deletes
To delete a large number of records with Data Loader:
1. As a backup measure, export the records you wish to delete, being sure to select all fields. (See
Export Data on page 10.) Save an extra copy of the generated CSV file.
2. Next, export the records you wish to delete, this time using only the record ID as the desired
criterion.
3. Launch the Data Loader and follow the delete or hard delete wizard. Map only the ID column.
See Insert, Update, or Delete Data Using Data Loader on page 12.
4. After the operation, review your success and error log files. See Reviewing Data Loader Output
Files on page 16.
Uploading Attachments
You can use Data Loader to upload attachments to Salesforce. Before uploading attachments, note the following:
If you intend to upload via the Bulk API, verify that Upload Bulk API Batch as Zip File on the Settings > Settings
page is enabled.
If you are migrating attachments from a source Salesforce organization to a target Salesforce organization, begin by requesting a
data export for the source organization. On the Schedule Export page, make sure to select the Include Attachments...
checkbox, which causes the file Attachment.csv to be included in your export. You can use this CSV file to upload the
attachments. For more information on the export service, see Exporting Backup Data in the Salesforce Help.
To upload attachments:
1. Confirm that the CSV file you intend to use for attachment importing contains the following required columns (each column represents
a Salesforce field):
ParentId - the Salesforce ID of the parent record.
Name - the name of the attachment file, such as myattachment.jpg.
Body - the absolute path to the attachment on your local drive.
Ensure that the values in the Body column contain the full file name of the attachments as they exist on your computer. For
example, if an attachment named myattachment.jpg is located on your computer at C:\Export, Body must specify
C:\Export\myattachment.jpg. Your CSV file might look like this:
ParentId,Name,Body
50030000000VDowAAG,attachment1.jpg,C:\Export\attachment1.jpg
701300000000iNHAAY,attachment2.doc,C:\Export\files\attachment2.doc
The CSV file can also include other optional Attachment fields, such as Description.
2. Proceed with an insert or upsert operation; see Insert, Update, or Delete Data Using Data Loader on page 12. At the Select
data objects step, make sure to select the Show all Salesforce objects checkbox and the Attachment
object name in the list.
Upload Content with the Data Loader
You can use Data Loader to upload documents and links into libraries in Salesforce CRM Content. Before uploading, note the following:
If you intend to upload via the Bulk API, verify that Upload Bulk API Batch as Zip
File on the Settings > Settings page is enabled.
When you upload a document from your local drive using Data Loader, specify the path in the
VersionData and PathOnClient fields in the CSV file. VersionData identifies
the location and extracts the format, and PathOnClient identifies the type of document
being uploaded.
When you upload a link using the Data Loader, specify the URL in ContentUrl. Don't use
PathOnClient or VersionData to upload links.
You can't export content using the Data Loader.
If you're updating content that you've already uploaded:
Perform the Insert function.
Include a ContentDocumentId column with an 18-character ID. Salesforce uses this information to determine that you're
updating content. When you map the ContentDocumentId, the updates are added to the content file. If you don't include
the ContentDocumentId, the content is treated as new, and the content file isn't updated.
1. Create a CSV file with the following fields.
Title - file name.
Description - (optional) file or link description.
Note: If there are commas in the description, use double quotes around the text.
VersionData - complete file path on your local drive (for uploading documents only).
Note: Files are converted to base64 encoding on upload. This action adds approximately 30% to the file size.
PathOnClient - complete file path on your local drive (for uploading documents only).
ContentUrl - URL (for uploading links only).
OwnerId - (optional) file owner, defaults to the user uploading the file.
FirstPublishLocationId - library ID.
To determine the RecordTypeId values for your organization using the AJAX Toolkit:
a. Log in to Salesforce.
b. Enter this URL in your browser:
http://instanceName.salesforce.com/soap/ajax/36.0/debugshell.html. Enter the
instanceName, such as na1, for your organization. You can see the instanceName in the URL field of your browser
d. Press Enter.
e. Click the arrows for recordTypeInfos.
The RecordTypeId values for your organization are listed.
TagsCsv - (optional) tag.
A sample CSV file is:
Title,Description,VersionData,PathOnClient,OwnerId,FirstPublishLocationId,RecordTypeId,TagsCsv
testfile,"This is a test file, use for bulk
upload",c:\files\testfile.pdf,c:\files\testfile.pdf,005000000000000,058700000004Cd0,012300000008o2sAQG,one
2. Upload the CSV file for the ContentVersion object (see Insert, Update, or Delete Data Using Data Loader on page 12). All documents
and links are available in the specified library.
Reviewing Data Loader Output Files
After every operation, Data Loader generates two CSV output files: a success file and an error file.
The success file contains all of the records that were successfully loaded. In this file, there's a column for the newly generated record
IDs. The error file contains all of the records that were rejected from the load operation. In this file, there's a column that describes
why the load failed.
5. Click Close to return to the CSV Chooser window, and then click OK to exit the window.
Note: To generate success files when exporting data, select the Generate status files for exports setting. For
more information, see Configure Data Loader on page 4.
Troubleshooting
If you are having login issues from the command line, ensure that the password provided in the configuration parameters is encrypted.
If you are having login issues from the UI, you may need to obtain a new security token.
Batch Mode
Note: The Data Loader command-line interface is supported for Windows only.
You can run Data Loader in batch mode from the command line. See the following topics:
Installed Directories and Files
Encrypt from the Command Line
Data Loader Command-Line Interface
Configure Batch Processes
Data Loader Command-Line Operations
Configure Database Access
Map Columns
Installed Directories and Files
Installing Data Loader creates several directories under the installation directory, including the following:
conf
The default configuration directory. Contains the configuration files config.properties,
Loader.class, and log-conf.xml.
The config.properties file that is generated when you modify the Settings dialog in the graphical user interface is located
at C:\Documents and Settings\your Windows username\Application Data\Salesforce\Data
Loader version_number. You can copy this file to the conf installation directory to use it for batch processes.
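For example, a sketch of copying that file into the installation's conf directory from a Windows command prompt; the paths and version number are assumptions for a typical installation:
copy "C:\Documents and Settings\your Windows username\Application Data\Salesforce\Data Loader version_number\config.properties" "C:\Program Files (x86)\salesforce.com\Data Loader\conf"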
The log-conf.xml file is included with version 35.0 of the Data Loader for Windows installer. The log-conf.xml is located
at %LOCALAPPDATA%\salesforce.com\Data Loader\samples\conf\log-conf.xml for the current user,
and C:\Program Files (x86)\salesforce.com\Data Loader\samples\conf\log-conf.xml for all
users.
samples
Contains subdirectories of sample files for reference.
Encrypt from the Command Line
When you run Data Loader from the command line, use the encrypt.bat utility in the Data Loader \bin directory to perform the following functions:
Generate a key
Key text is generated on screen from the text you provide. Carefully copy the key text to a key file, omitting any leading or trailing
spaces. The key file can then be used for encryption and decryption.
Encrypt text
Generates an encrypted version of a password or other text. Optionally, you can provide a key file for the encryption. In the
configuration file, make sure that the encrypted text is copied precisely and the key file is mentioned.
Verify encrypted text
Given encrypted and decrypted versions of a password, verifies whether the encrypted password provided matches its decrypted
version. A success or failure message is printed to the command line.
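For example, a sketch of the three functions run from a command prompt, assuming the utility's -g, -e, and -v options and a key file at C:\DLTest\key.txt (all values shown are placeholders):
encrypt.bat -g <seedtext>
encrypt.bat -e <password> "C:\DLTest\key.txt"
encrypt.bat -v <encrypted text> <expected text> "C:\DLTest\key.txt"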
Data Loader Command-Line Interface
For automated batch operations, such as nightly scheduled loads and extractions, run Data Loader from the command line with process.bat.
For more information about using process.bat, see Run Individual Batch Processes on page 36.
To view tips and instructions, add -help to the command contained in process.bat.
Data Loader runs whatever operation, file, or map is specified in the configuration file that you specify. If you do not specify a configuration
directory, the current directory is used. By default, Data Loader configuration files are installed at the following location:
C:\Program Files\Salesforce\Data Loader version number\conf
You use the process-conf.xml file to configure batch processing. Set the name of the process in the bean element's id attribute:
(for example <bean id="myProcessName">).
If you want to implement enhanced logging, use a copy of log-conf.xml.
You can change parameters at runtime by giving param=value as program arguments. For example, adding
process.operation=insert to the command changes the configuration at runtime.
You can set the minimum and maximum heap size. For example, -Xms256m -Xmx256m sets the heap size to 256 MB.
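For example, a sketch of a run that overrides the configured operation at runtime, assuming a process named accountMasterProcess is defined in ..\conf\process-conf.xml:
process.bat "..\conf" accountMasterProcess process.operation=insert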
Note: These topics only apply to Data Loader version 8.0 and later.
Tip: If you experience login issues in the command line interface after upgrading to a new version of Data Loader, please try
re-encrypting your password to solve the problem. For information on the password encryption utility, see Encrypt from the
Command Line on page 19.
Configure Batch Processes
Use the process-conf.xml file to configure batch processing. The following parameters control the output files that describe the last run:
enableLastRunOutput
If set to true (the default), output files containing information about the last run, such as
sendAccountsFile_lastrun.properties, are generated and saved to the location specified by
lastRunOutputDirectory. If set to false, the files are not generated or saved.
lastRunOutputDirectory
The directory location where output files containing information about the last run, such as
sendAccountsFile_lastrun.properties, are written. The default value is \conf. If enableLastRunOutput
is set to false, this value is not used because the files are not generated.
The configuration backing file stores configuration parameter values from the last run for debugging purposes, and is used to load
default configuration parameters in config.properties. The settings in configOverrideMap take precedence over those
in the configuration backing file. The configuration backing file is managed programmatically and does not require any manual edits.
For the names and descriptions of available process configuration parameters, see Data Loader Process Configuration Parameters on
page 22.
Data Loader Process Configuration Parameters
Each parameter is listed with its data type and, where one exists, the equivalent option in the Settings dialog.
dataAccess.readUTF8 (boolean; Settings dialog: Read all CSVs with UTF-8 encoding)
dataAccess.writeUTF8 (boolean; Settings dialog: Write all CSVs with UTF-8 encoding)
Select this option to force files to be written in UTF-8 encoding. Sample value: true
dataAccess.name (string)
Sample value: c:\dataloader\data\extractLead.csv
dataAccess.readBatchSize (integer)
Number of records read from the database at a time. The maximum value is 200. Sample value: 50
dataAccess.type (string)
Standard or custom data source type. Standard types are csvWriter, csvRead, databaseWrite, and databaseRead.
process.enableExtractStatusOutput (boolean; Settings dialog: Generate status files for exports)
process.enableLastRunOutput (boolean)
process.encryptionKeyFile (string, file name)
process.initialLastRunDate (date)
Format must be yyyy-MM-ddTHH:mm:ss.SSS+/-HHmm. For example: 2006-04-13T13:50:32.423-0700
process.lastRunOutputDirectory (string, directory)
When running Data Loader in batch mode, you can change the location where output files such as sendAccountsFile_lastRun.properties are written.
process.loadRowToStartAt (number; Settings dialog: Start at row)
process.mappingFile (string, file name)
Sample value: c:\dataloader\conf\accountExtractMap.sdl
process.outputError (string, file name)
Sample value: c:\dataloader\status\myProcessErrors.csv
process.outputSuccess (string, file name)
The name of the CSV file that stores success data from the last operation. See also process.enableExtractStatusOutput on page 23. Sample value: c:\dataloader\status\myProcessSuccesses.csv
process.statusOutputDirectory (string, directory)
process.useEuropeanDates (boolean; Settings dialog: Use European date format)
sfdc.assignmentRule (string; Settings dialog: Assignment rule)
Sample value: 03Mc00000026J7w
sfdc.bulkApiSerialMode (boolean; Settings dialog: Enable serial mode for Bulk API)
sfdc.bulkApiZipContent (boolean; Settings dialog: Upload Bulk API Batch as Zip File)
sfdc.connectionTimeoutSecs (integer)
Sample value: 60
sfdc.debugMessages (boolean)
sfdc.debugMessagesFile (string, file name)
Sample value: \lexiloader\status\sfdcSoapTrace.log
sfdc.enableRetries (boolean)
sfdc.endpoint (URL; Settings dialog: Server host)
sfdc.externalIdField (string)
sfdc.extractionRequestSize (integer; Settings dialog: Query request size)
sfdc.insertNulls (boolean; Settings dialog: Insert null values)
sfdc.loadBatchSize (integer; Settings dialog: Batch size)
sfdc.maxRetries (integer)
Sample value: 3
sfdc.minRetrySleepSecs (integer)
The minimum number of seconds to wait between connection retries. The wait time increases with each try. See sfdc.enableRetries on page 26. Sample value: 2
sfdc.noCompression (boolean)
Compression enhances the performance of Data Loader and is turned on by default. You may want to disable compression if you need to debug the underlying SOAP messages. To turn off compression, enable this option.
sfdc.password (encrypted string)
sfdc.proxyHost (URL; Settings dialog: Proxy host)
Sample value: http://myproxy.internal.company.com
sfdc.proxyPassword (encrypted string; Settings dialog: Proxy password)
Sample value: 4285b36161c65a22
sfdc.proxyPort (integer; Settings dialog: Proxy port)
The proxy server port.
sfdc.proxyUsername (string; Settings dialog: Proxy username)
sfdc.resetUrlOnLogin (boolean; Settings dialog: Reset URL on Login)
sfdc.timeoutSecs (integer; Settings dialog: Timeout)
sfdc.timezone (string; Settings dialog: Time Zone)
sfdc.truncateFields (boolean; Settings dialog: Allow field truncation)
Sample value: true
sfdc.useBulkApi (boolean; Settings dialog: Use Bulk API)
sfdc.username (string)
Data Loader Command-Line Operations
When running Data Loader in batch mode from the command line, the following operations are supported:
Insert
Loads data from a data source into Salesforce as new records.
Update
Loads data from a data source into Salesforce, where existing records with matching ID fields are updated.
Upsert
Loads data from a data source into Salesforce, where existing records with a matching custom external ID field are updated; records
without matches are inserted as new records.
Delete
Loads data from a data source into Salesforce, where existing records with matching ID fields are deleted.
Hard Delete
Loads data from a data source into Salesforce, where existing records with matching ID fields are deleted without being stored first
in the Recycle Bin.
Configure Database Access
DatabaseConfig Bean
The top-level database configuration object is the DatabaseConfig bean, which has the
following properties:
sqlConfig
The SQL configuration bean for the data access object that interacts with a database.
dataSource
The bean that acts as database driver and authenticator. It must refer to an implementation of javax.sql.DataSource such
as org.apache.commons.dbcp.BasicDataSource.
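A minimal sketch of a DatabaseConfig bean follows; the bean id, the DatabaseConfig class name, and the referenced sqlConfig bean are assumptions for illustration, and the dataSource reference matches the DataSource example below:
<bean id="queryAccount"
      class="com.salesforce.dataloader.dao.database.DatabaseConfig"
      singleton="true">
    <!-- SQL statement, parameters, and column names for this data access object -->
    <property name="sqlConfig" ref="queryAccountSqlConfig"/>
    <!-- JDBC driver, connection URL, and credentials -->
    <property name="dataSource" ref="oracleRepDataSource"/>
</bean>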
DataSource
The DataSource bean sets the physical information needed for database connections. It contains the following properties:
driverClassName
The fully qualified name of the implementation of a JDBC driver.
url
The string for physically connecting to the database.
username
The username for logging in to the database.
password
The password for logging in to the database.
Depending on your implementation, additional information may be required. For example, use
org.apache.commons.dbcp.BasicDataSource when database connections are pooled.
The following code is an example of a DataSource bean:
<bean id="oracleRepDataSource"
class="org.apache.commons.dbcp.BasicDataSource"
destroy-method="close">
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver"/>
<property name="url" value="jdbc:oracle:thin:@myserver.salesforce.com:1521:TEST"/>
<property name="username" value="test"/>
<property name="password" value="test"/>
</bean>
Versions of Data Loader from API version 25.0 onwards do not come with an Oracle JDBC driver. Using Data Loader to connect to an
Oracle data source without a JDBC driver installed will result in a Cannot load JDBC driver class error. To add the Oracle JDBC driver to
Data Loader:
Download the latest JDBC driver from
http://www.oracle.com/technetwork/database/features/jdbc/index-091264.html.
Spring Framework
Note: The Data Loader command-line interface is supported for Windows only.
The Data Loader configuration files are based on the Spring Framework, which is an open-source,
full-stack Java/J2EE application framework.
The Spring Framework allows you to use XML files to configure beans. Each bean represents an
instance of an object; the parameters correspond to each object's setter methods. A typical bean
has the following attributes:
id
Uniquely identifies the bean to XmlBeanFactory, which is the class that gets objects from
an XML configuration file.
class
Specifies the implementation class for the bean instance.
For more information on the Spring Framework, see the official documentation and the support forums. Note that Salesforce cannot
guarantee the availability or accuracy of external websites.
SQL Configuration
Note: The Data Loader command-line interface is supported for Windows only.
When running Data Loader in batch mode from the command line, the SqlConfig class contains
configuration parameters for accessing specific data in the database. As shown in the code samples
below, queries and inserts are different but very similar. The bean must be of type
com.salesforce.dataloader.dao.database.SqlConfig and have the following
properties:
sqlString
The SQL statement to be executed. It can contain replacement parameters that make the string dependent on configuration or operation variables.
sqlParams
A property of type map that contains descriptions of the replacement parameters specified in sqlString. Each entry represents
one replacement parameter: the key is the replacement parameter's name, the value is the fully qualified Java type to be used when
the parameter is set on the SQL statement. Note that java.sql types are sometimes required, such as java.sql.Date instead
of java.util.Date. For more information, see the official JDBC API documentation.
columnNames
Used when queries (SELECT statements) return a JDBC ResultSet. Contains column names for the data outputted by executing
the SQL. The column names are used to access and return the output to the caller of the DataReader interface.
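As an illustration of these properties, here is a minimal sketch of a query-style SqlConfig bean. The bean id, the SQL statement, the column names, and the @process.lastRunDate@ replacement-parameter syntax are assumptions for illustration:
<bean id="queryAccountSqlConfig"
      class="com.salesforce.dataloader.dao.database.SqlConfig"
      singleton="true">
    <!-- The SQL to execute; @process.lastRunDate@ is a replacement parameter -->
    <property name="sqlString">
        <value>
            SELECT account_id, account_name FROM account
            WHERE last_updated &gt; @process.lastRunDate@
        </value>
    </property>
    <!-- Java type for each replacement parameter (java.sql types where required) -->
    <property name="sqlParams">
        <map>
            <entry key="process.lastRunDate" value="java.sql.Timestamp"/>
        </map>
    </property>
    <!-- Column names for the ResultSet returned by the query -->
    <property name="columnNames">
        <list>
            <value>account_id</value>
            <value>account_name</value>
        </list>
    </property>
</bean>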
Map Columns
Note: The Data Loader command-line interface is supported for Windows only.
When running Data Loader in batch mode from the command line, you must create a properties
file that maps values between Salesforce and data access objects.
1. Create a new mapping file and give it an extension of .sdl.
2. Observe the following syntax:
On each line, pair a data source with its destination.
In an import file, put the data source on the left, an equals sign (=) as a separator, and the
destination on the right. In an export file, put the destination on the left, an equals sign (=)
as a separator, and the data source on the right.
Data sources can be either column names or constants. Surround constants with double quotation marks, as in "sampleconstant". Values without quotation marks are treated as column names.
Destinations must be column names.
You may map constants by surrounding them with double quotation marks, as in:
"Canada"=BillingCountry
3. In your configuration file, use the parameter process.mappingFile to specify the name of your mapping file.
Note: If your field name contains a space, you must escape the space by prepending it with a backslash (\). For example:
Account\ Name=Name
Run Individual Batch Processes
To start an individual batch process, use \bin\process.bat.
Process Example
process ../conf accountMasterProcess
Note: You can configure external process launchers such as the Microsoft Windows XP Scheduled Task Wizard to run processes
on a schedule.
Command-Line Quick Start
Note: The Data Loader command-line interface is supported for Windows only.
This quick start covers the following topics:
Data Loader Introduction
Prerequisites
Data Loader Introduction
This quick start shows you how to use the Data Loader command-line functionality to import data. The process includes these steps:
Step 1: Create the encryption key
Step 2: Create the encrypted password
Step 3: Create the field mapping file
Step 4: Create a process-conf.xml file that contains the import configuration settings
Step 5: Run the process and import the data
SEE ALSO:
Prerequisites
Prerequisites
Note: The Data Loader command-line interface is supported for Windows only.
To step through this quick start requires the following:
Data Loader installed on the computer that runs the command-line process.
The Java Runtime Environment (JRE) installed on the computer that runs the command-line
process.
Familiarity with importing and exporting data by using the Data Loader interactively through
the user interface. This makes it easier to understand how the command-line functionality
works.
Tip: When you install Data Loader, sample files are installed in the samples directory. This
directory is found below the program directory, for example, C:\Program Files
(x86)\salesforce.com\Apex Data Loader 22.0\samples\. Examples
of files that are used in this quick start can be found in the \samples\conf directory.
SEE ALSO:
Data Loader Introduction
Step One: Create the Encryption Key
1. Open a command prompt window.
2. In the command window, enter cd\ to navigate to the root directory of the drive where Data
Loader is installed.
3. Navigate to the Data Loader \bin directory by entering this command. Be sure to replace the file path with the path from your
system.
cd C:\Program Files (x86)\salesforce.com\Apex Data Loader 22.0\bin
4. Create an encryption key by entering the following command. Replace <seedtext> with any string.
encrypt.bat -g <seedtext>
Note: To see a list of command-line options for encrypt.bat, type encrypt.bat from the command line.
5. Copy the generated key from the command window to a text file named key.txt and make a note of the file path. In this example,
the generated key is e8a68b73992a7a54.
Note: Enabling quick edit mode on a command window can make it easier to copy data to and from the window. To enable
quick edit mode, right-click the top of the window and select Properties. On the Options tab, select QuickEdit Mode.
The encryption utility is used to encrypt passwords, but data that you transmit using Data Loader is not encrypted.
SEE ALSO:
Data Loader Introduction
Step Two: Create the Encrypted Password
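To create the encrypted password, run the encrypt utility against the password that Data Loader uses to log in, together with the key file from step one. A minimal sketch, assuming the utility's -e option and a key file saved at C:\DLTest\key.txt:
encrypt.bat -e <password> "C:\DLTest\key.txt"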
2. Copy the encrypted password that is generated by the command. You use this value in a later step.
SEE ALSO:
Data Loader Introduction
Step Three: Create the Field Mapping File
Copy the following to a text file and save it as accountInsertMap.sdl. This is an insert, so the data source (the CSV column name) is on the left of the equals sign and the destination (the Salesforce field name) is on the right.
#Mapping values
#Thu May 26 16:19:33 GMT 2011
Name=Name
NumberOfEmployees=NumberOfEmployees
Industry=Industry
Tip: For complex mappings, you can use the Data Loader user interface to map source and destination fields and then save
those mappings to an .sdl file. This is done on the Mapping dialog box by clicking Save Mapping.
SEE ALSO:
Data Loader Introduction
Step Four: Create the Configuration File
<entry key="process.mappingFile"
value="C:\DLTest\Command Line\Config\accountInsertMap.sdl"/>
<entry key="dataAccess.name"
value="C:\DLTest\In\insertAccounts.csv"/>
<entry key="process.outputSuccess"
value="c:\DLTest\Log\accountInsert_success.csv"/>
<entry key="process.outputError"
value="c:\DLTest\Log\accountInsert_error.csv"/>
<entry key="dataAccess.type" value="csvRead"/>
<entry key="process.initialLastRunDate"
value="2005-12-01T00:00:00.000-0800"/>
</map>
</property>
</bean>
</beans>
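For reference, here is a minimal sketch of a complete process-conf.xml for this import. The Spring DTD declaration, the ProcessRunner bean class, and entries such as sfdc.entity and process.encryptionKeyFile are assumptions based on the sample files that ship with Data Loader; adjust the paths and values for your system:
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="accountInsert"
          class="com.salesforce.dataloader.process.ProcessRunner"
          singleton="false">
        <property name="name" value="accountInsert"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.endpoint" value="https://na1.salesforce.com"/>
                <entry key="sfdc.username" value="[email protected]"/>
                <!-- Encrypted password from Step Two, decrypted with the key file below -->
                <entry key="sfdc.password" value="encryptedPasswordFromStepTwo"/>
                <entry key="process.encryptionKeyFile" value="C:\DLTest\key.txt"/>
                <entry key="sfdc.entity" value="Account"/>
                <entry key="process.operation" value="insert"/>
                <entry key="sfdc.debugMessages" value="true"/>
                <entry key="sfdc.debugMessagesFile" value="C:\DLTest\Log\accountInsertSoapTrace.log"/>
                <entry key="process.mappingFile" value="C:\DLTest\Command Line\Config\accountInsertMap.sdl"/>
                <entry key="dataAccess.name" value="C:\DLTest\In\insertAccounts.csv"/>
                <entry key="dataAccess.type" value="csvRead"/>
                <entry key="process.outputSuccess" value="c:\DLTest\Log\accountInsert_success.csv"/>
                <entry key="process.outputError" value="c:\DLTest\Log\accountInsert_error.csv"/>
            </map>
        </property>
    </bean>
</beans>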
3. Modify the following parameters in the process-conf.xml file. For more information about the process configuration parameters, see Data Loader Process Configuration Parameters on page 22.
sfdc.endpoint: Enter the URL of the Salesforce instance for your organization; for example, https://na1.salesforce.com.
sfdc.username: Enter the username Data Loader uses to log in.
sfdc.password: Enter the encrypted password value that you created in step 2.
process.mappingFile: Enter the path and file name of the mapping file.
dataAccess.name: Enter the path and file name of the data file that contains the accounts that you want to import.
sfdc.debugMessages: Currently set to true for troubleshooting. Set this to false after your import is up and running.
sfdc.debugMessagesFile: Enter the path and file name of the command line log file.
process.outputSuccess: Enter the path and file name of the success log file.
process.outputError: Enter the path and file name of the error log file.
Warning: Use caution when using different XML editors to edit the process-conf.xml file. Some editors add XML
tags to the beginning and end of the file, which causes the import to fail.
SEE ALSO:
Data Loader Introduction
Step Five: Import the Data
Note: The Data Loader command-line interface is supported for Windows only.
Now that all the pieces are in place, you can run Data Loader from the command line and insert some new accounts.
1. Copy the following data to a file named accountInsert.csv. This is the account data that you import into your organization.
Name,Industry,NumberOfEmployees
Dickenson plc,Consulting,120
GenePoint,Biotechnology,265
Express Logistics and Transport,Transportation,12300
Grand Hotels & Resorts Ltd,Hospitality,5600
2. In a command prompt window, enter the following command:
process.bat "<file path to process-conf.xml>" <process name>
Replace <file path to process-conf.xml> with the path to the directory containing process-conf.xml.
Replace <process name> with the process specified in process-conf.xml.
Your command should look something like this:
process.bat "C:\DLTest\Command Line\Config" accountInsert
After the process runs, the command prompt window displays success and error messages. You can also check the log files:
accountInsert_success.csv and accountInsert_error.csv. After the process runs successfully, the
accountInsert_success.csv file contains the records that you imported, along with the ID and status of each record.
For more information about the status files, see Reviewing Data Loader Output Files on page 16.
SEE ALSO:
Data Loader Introduction
Data Loader Third-Party Licenses
The following third-party software is included in Data Loader, listed with version number and license:
Apache Commons Collections 3.1: http://www.apache.org/licenses/LICENSE-2.0
Apache Commons Database Connection Pooling (DBCP) 1.2.1: http://www.apache.org/licenses/LICENSE-2.0
Apache Commons Logging 1.0.3: http://www.apache.org/licenses/LICENSE-1.1
Apache Log4j 1.2.8: http://www.apache.org/licenses/LICENSE-2.0
Eclipse SWT 3.452: http://www.eclipse.org/legal/epl-v10.html
OpenSymphony Quartz Enterprise Job Scheduler 1.5.1: http://www.opensymphony.com/quartz/license.action
1.6R2: http://www.mozilla.org/MPL/MPL-1.1.txt
Spring Framework 1.2.6: http://www.apache.org/licenses/LICENSE-2.0.txt
Note: Salesforce is not responsible for the availability or content of third-party websites.
INDEX
A
Apex Data Loader
    See Data Loader 1
B
Bulk API
    uploading attachments 14
C
Command line
    configuration file (Data Loader) 41
    encrypted password (Data Loader) 40
    encryption key (Data Loader) 39
    field mapping file (Data Loader) 40
    importing data (Data Loader) 43
    introduction (Data Loader) 38
    prerequisites (Data Loader) 38
    quick start (Data Loader) 37
D
Data
    import limits 17
Data Loader
    attachments 7
    batch files 19
    batch mode 18
    batch mode parameters 22
    Bulk API 4, 7, 15
    column mapping 35
    command line interface 20
    command line introduction 38
    command line operations 30
    command line quick start 37
    config.properties 22
    configuration file (command line) 41
    configuring 4, 7
    configuring batch processes 21
    data types 9
    Database Access 30
    date formats 9
    encrypted password (command line) 40
    encryption key (command line) 39
    field mapping file (command line) 40
    importing data (command line) 43
    installed files 19
    installing 3
    JDBC Driver 30
    overview 1
    password encryption 19
    prerequisites (command line) 38
    sample files 19
    settings 7
    Spring Framework 32
    starting batch processes 36
    system requirements 3
    third-party licenses 44
    troubleshooting 17
    uploading 15
    uploading attachments 14
    using 7, 8
    when to use 2
L
Limits
    data import 17
S
Spring Framework, see Data Loader 32