
Best Practices to Avoid Excessive SOAP and REST API DML

Knowledge Article Number 000212718
Description When developing integration applications using the SOAP API or REST API, you should make sure your code is as efficient as possible to avoid poor performance or API limit issues due to excessive API calls. See the following best practices for tips on making your integration code as efficient as possible.

The SOAP API and REST API provide powerful, convenient Web Services interfaces for interacting with Salesforce. A common use of these APIs is in integration projects, where you are integrating Salesforce with your existing systems. Integration projects use the SOAP API or REST API to make changes to Salesforce data, including creating, updating or deleting records.

While these APIs are capable of handling large record sets, you should follow best practices to ensure your code is as efficient as possible. Inefficient code can result in excessive API calls, leading to poor performance or Salesforce limit issues.

This article provides some suggestions on how to improve the performance of your code to handle creation, update or deletion of large record sets efficiently.

Use Incremental Processing

Incremental processing involves working with just the data that changed and only updating changed data in Salesforce. For example, you could use the SystemModstamp date-time field or a custom field to mark records that need to be modified, and only update those records in Salesforce, instead of doing a wholesale update.

If your integration code is doing large updates without taking advantage of incremental processing, see if you can take advantage of incremental processing to greatly improve performance and reduce the number of API operations.

See Incremental Processing for Heavy Bulk API / DML Use Cases for use cases, best practices and examples (the article applies to the SOAP API and REST API as well as the Bulk API).
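As a sketch of this pattern, the following snippet builds a SOQL query that selects only the Accounts modified since the last successful sync, filtering on the standard SystemModstamp field. The IncrementalQueryBuilder class and the lastSync value are illustrative assumptions; in a real integration the last sync time would be persisted by your own code:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

// Sketch: build a SOQL query that fetches only records changed since the
// last successful sync, instead of querying and updating everything.
public class IncrementalQueryBuilder {

    // SOQL expects date-time literals in ISO-8601 UTC form, e.g. 2024-01-15T00:00:00Z
    static final DateTimeFormatter SOQL_FORMAT =
        DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss'Z'").withZone(ZoneOffset.UTC);

    public static String buildQuery(Instant lastSyncTime) {
        return "SELECT Id, Name FROM Account WHERE SystemModstamp > "
             + SOQL_FORMAT.format(lastSyncTime);
    }

    public static void main(String[] args) {
        // Hypothetical last-sync timestamp, normally loaded from persistent storage
        Instant lastSync = Instant.parse("2024-01-15T00:00:00Z");
        System.out.println(buildQuery(lastSync));
    }
}
```

Only the records returned by such a query need to be processed and written back, which keeps both the data volume and the API call count down.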

Bulkify your SOAP API Code

Make sure your SOAP API calls process records in batches, rather than one at a time. Processing records one at a time incurs extra overhead for each call, which degrades performance, and also uses up more API calls against your current daily limit.

A common indicator that you might not be bulkifying your SOAP API calls is making calls inside something like a for loop. All SOAP API data modification calls, such as create(), delete(), update() and upsert(), can take an array of records to modify. Where possible, batch your changes in a local array and pass the array of changes in a single call. As a simple example, if you have a set of Accounts to create, rather than calling create() for each record in a loop, populate an array with all the new accounts and pass that array into a single create() call:


Account[] accounts;

// populate accounts array with all new account records

// create all accounts in Salesforce with a single call to create()
SaveResult[] saveResults = connection.create(accounts);

Note that your array cannot exceed 200 records in a single call.
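If your source data exceeds that limit, you can split it into batches of at most 200 before calling the API. The helper below is a minimal, self-contained sketch of that batching step; the BatchSplitter class is illustrative and not part of the SOAP API, and each resulting batch would be passed to its own create(), update(), upsert() or delete() call:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: split a large record list into batches of at most batchSize records,
// so each batch fits within the 200-record-per-call SOAP API limit.
public class BatchSplitter {

    public static <T> List<List<T>> split(List<T> records, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            // subList returns a view of the original list; no records are copied
            batches.add(records.subList(i, Math.min(i + batchSize, records.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> ids = new ArrayList<>();
        for (int i = 0; i < 450; i++) ids.add(i);
        // 450 records split into batches of 200 -> 3 batches (200, 200, 50)
        System.out.println(split(ids, 200).size());
    }
}
```

This still uses far fewer API calls than one call per record: 450 records cost 3 calls instead of 450.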

The array you pass to data modification calls is not limited to records of a single object type. You can use an SObject array to collect records of multiple object types, and use this array in a single call. This is useful when modifying related records. For example, you could use an SObject array and foreign keys to create a parent and child record in a single call to create():

Opportunity newOpportunity = new Opportunity();
newOpportunity.setName("NewOpportunityWithParent");
newOpportunity.setStageName("Prospecting");
Calendar dt = connection.getServerTimestamp().getTimestamp();
dt.add(Calendar.DAY_OF_MONTH, 7);
newOpportunity.setCloseDate(dt);

// Create the parent reference
// Used only for the foreign key reference and doesn't contain other fields
// (MyExtId__c is a hypothetical external ID field on Account)
Account accountReference = new Account();
accountReference.setMyExtId__c("PARENT-001");
newOpportunity.setAccount(accountReference);

// Create the Account object to insert
// Same external ID as above, but also has the Name field
// Used for the create call
Account parentAccount = new Account();
parentAccount.setName("Parent Account");
parentAccount.setMyExtId__c("PARENT-001");

SObject[] createObjects = { parentAccount, newOpportunity };

// Create the account and the opportunity in a single call
SaveResult[] results = connection.create(createObjects);

Note that you cannot refer to more than 10 different object types in a single call.
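A simple pre-flight check for this limit can be sketched as follows. For illustration, object types are represented here by their API names as plain strings; real code would read the type from each SObject, and the ObjectTypeGuard class is a hypothetical helper, not part of the SOAP API:

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Sketch: verify a mixed-type batch stays within the limit of 10 distinct
// object types per SOAP API data modification call.
public class ObjectTypeGuard {
    static final int MAX_TYPES = 10;

    public static boolean withinTypeLimit(List<String> objectTypes) {
        // Count distinct object type names in the batch
        Set<String> distinct = new LinkedHashSet<>(objectTypes);
        return distinct.size() <= MAX_TYPES;
    }

    public static void main(String[] args) {
        // Two distinct types (Account, Opportunity) -> within the limit
        System.out.println(withinTypeLimit(List.of("Account", "Opportunity", "Account")));
    }
}
```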

See the following articles for more details on bulkifying your SOAP API calls:

SOAP API Developer’s Guide (see reference topics on create(), delete(), update() and upsert())
Cascade Insert with External ID Fields
delete() via API Call (see Basic Steps for Deleting Records)

Note that you cannot bulkify REST API calls. If you are doing large bulk operations, consider using the Bulk API.

Check for Data Skew Impact

A large number of related records can cause performance issues with DML operations. For example, if a client application uses a delete() call to delete an Opportunity, any associated OpportunityLineItem records are also deleted. Large numbers of related records can also slow DML operations because of sharing recalculations or record locking on the related records. As another example, deleting a large number of master records in a master-detail relationship with many children incurs a cascade-delete performance hit.

If you’re seeing a performance impact due to a large number of related records, this might be a sign of data skew. Typically, a record with more than 10,000 related records can cause data skew performance problems. Consider performing data management to address the data skew:

  • Can any child records be removed?
  • Can the relationship type be changed to avoid cascade operations (lookup instead of master-detail)?

See Avoid Account Data Skew for Peak Performance for more information on data skew.
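One way to spot skewed parents before running a large DML job is to count children per parent in your source data. The sketch below is illustrative: the SkewDetector class is hypothetical, and it assumes you can extract each child record's parent ID into a list before loading:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: count children per parent and flag parents whose child count
// exceeds the ~10,000-record threshold where data skew typically starts
// to hurt DML performance (sharing recalculation, record locking).
public class SkewDetector {
    static final int SKEW_THRESHOLD = 10_000;

    // childParentIds: the parent lookup value taken from each child record
    public static Map<String, Integer> countChildren(List<String> childParentIds) {
        Map<String, Integer> counts = new HashMap<>();
        for (String parentId : childParentIds) {
            counts.merge(parentId, 1, Integer::sum);
        }
        return counts;
    }

    public static boolean isSkewed(Map<String, Integer> counts, String parentId) {
        return counts.getOrDefault(parentId, 0) > SKEW_THRESHOLD;
    }
}
```

Parents flagged this way are candidates for the data management questions above, such as removing child records or switching the relationship to a lookup.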

Be Aware of Operations that Cause Cascading Changes

Certain operations can trigger cascade operations that result in more data operations than expected. Record changes can trigger post-operation actions, including workflow rules, Apex triggers, and assignment rules. These post-operation actions can in turn generate additional data operations. In some scenarios, this might be wasted effort. For example, you might have an Apex trigger that updates Opportunities related to an Account. If you do a large-scale replacement of Account records followed by a large-scale replacement of Opportunity records, the trigger updates after the Account record replacement will be wasted work. Consider disabling post-operation actions during large updates if you know in advance that the actions will not be needed.

See Extreme Data Loading Part 3: Suspending Events that Fire on Insert for more examples and suggestions on when to disable post-operation actions to make updates faster and more efficient.

Check for parent roll-up summary fields

If a master object of a detail object being modified has a roll-up summary field, SOAP API DML operations on the child object might be incurring a performance hit due to parent roll-up summary field recalculations. Also, recalculating the roll-up summary field may trigger other workflow rules and field validations, which can add to the performance hit.

Check whether your roll-up summary fields are still needed, whether they can be adjusted, or whether the triggered workflow rules, validations, and so on can be removed or reduced. See the following for more details on addressing roll-up summary field performance:

About Roll-Up Summary Fields
