
Best practices for Data Import

Although dataloader.io is quite flexible at understanding your import data, there are certain formatting details
you need to be careful with when preparing your import files. A seemingly insignificant hiccup can make the
entire file unreadable to the machine. By following these simple best practices you can safely avoid running
into that situation:

- Use a spreadsheet program such as Microsoft Excel to create your CSV file.
- You can also verify your CSV file with a text editor.
- You can use csvlint.io to check the CSV file.
- Save the CSV file in UTF-8 format.
- Make sure you don't have any duplicated or empty headers.
- Make sure you don't have any empty rows in the middle of the file or at the bottom (empty rows at the bottom are not visible in Excel; use a text editor to view them). The sanity-check sketch after this list flags both header and empty-row problems.
- The first row of your file should contain the column headers, such as First Name, Last Name or Profit.
- The import process is easier when the file structure resembles the object structure in Salesforce: the column headers are then mapped to Salesforce fields automatically, so you do not need to map them manually.
- Make sure all your date fields are formatted correctly and consistently (see the date-normalization sketch after this list).
- You can enclose field values in double quotes ("); this is required whenever a value contains the comma delimiter or a line break (see the quoting sketch after this list).
- Keep your files at a maximum size of 50 MB.
- When dealing with files that are larger than 200 rows, pick the Bulk API option in the final window of the create task wizard. This gives you optimal performance for large files.
- If any errors do still occur, check the results files that are generated when tasks run; they can be a big help in understanding what went wrong. You can view and download these files on the tasks list page, next to each task.
- Run any import/update/upsert tasks carefully so that you don't overload your Salesforce org: depending on the amount of data you are processing, you may want to run heavy jobs outside business hours to prevent resource starvation on the Salesforce side. If too many users are hitting your Salesforce org while you are loading data, processing can take too long and you risk hitting the 2-hour execution limit in dataloader.io (after 2 hours of execution, tasks are automatically expired).
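
The header and empty-row checks above are easy to automate before you upload. Here is a minimal Python sketch, not a dataloader.io feature: the file name contacts.csv is hypothetical, and the line-number reporting assumes one record per physical line.

```python
import csv

def sanity_check_csv(path):
    """Flag common pre-import problems: duplicate or empty headers,
    and blank rows anywhere in the file (including the invisible
    ones at the bottom)."""
    problems = []
    # Opening with encoding="utf-8" doubles as an encoding check: a
    # UnicodeDecodeError here means the file was not saved as UTF-8.
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    headers = rows[0] if rows else []
    if any(not h.strip() for h in headers):
        problems.append("empty header found")
    if len(set(headers)) != len(headers):
        problems.append("duplicate headers found")
    for lineno, row in enumerate(rows[1:], start=2):
        if not any(cell.strip() for cell in row):
            problems.append(f"empty row at line {lineno}")
    return problems

print(sanity_check_csv("contacts.csv"))  # hypothetical file name
```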
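
Salesforce date fields generally expect ISO 8601 values (YYYY-MM-DD). If your source data uses a different convention, a small normalization pass before import can save you a failed run; in this sketch the source format string is an assumption you would adjust to match your data.

```python
from datetime import datetime

def normalize_date(value, source_format="%d/%m/%Y"):
    """Convert a date string from an assumed day/month/year source
    format to the ISO 8601 form (YYYY-MM-DD) Salesforce expects."""
    return datetime.strptime(value, source_format).strftime("%Y-%m-%d")

print(normalize_date("31/12/2023"))  # -> 2023-12-31
```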
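
To see how quoting interacts with the comma delimiter, here is a sketch using Python's standard csv module; the field names and file name are illustrative. QUOTE_MINIMAL wraps only the values that actually need quoting.

```python
import csv

rows = [
    ["First Name", "Last Name", "Company"],
    ["Ada", "Lovelace", "Analytical Engines, Ltd."],  # value contains a comma
]

with open("contacts.csv", "w", newline="", encoding="utf-8") as f:
    # QUOTE_MINIMAL quotes only values containing the delimiter,
    # the quote character, or a line break.
    csv.writer(f, quoting=csv.QUOTE_MINIMAL).writerows(rows)

# Resulting second line: Ada,Lovelace,"Analytical Engines, Ltd."
# The quotes keep the comma inside the value from splitting the field.
```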
