BEST PRACTICES: Best Practices for Deployments with Large Data Volumes
|Knowledge Article Number||000221698|
Who Should Read This
This paper is for experienced application architects who work with Salesforce deployments that contain large data volumes. “Large data volume” is an imprecise, elastic term, but if your deployment has tens of thousands of users, tens of millions of records, or hundreds of gigabytes of total record storage, you can use the information in this paper. Much of that information also applies to smaller deployments, so even if you work with those, you might still learn something from this document and its best practices. To understand the parts of this paper that deal with details of the Salesforce implementation, read https://developer.salesforce.com/page/Multi_Tenant_Architecture.
Salesforce enables customers to easily scale their applications from small to large amounts of data. This scaling usually happens automatically, but as data sets get larger, the time required for certain operations can grow. The ways in which architects design and configure data structures and operations can increase or decrease those operation times by several orders of magnitude.
The main processes affected by differing architectures and configurations are:
What’s in This Paper
Link to the full guide: http://www.salesforce.com/docs/en/cce/ldv_deployments/salesforce_large_data_volumes_bp.pdf