          Data 360 Limits and Guidelines

          Guidelines are best-practice recommendations for optimizing your adoption of Data 360 and its performance. Limits are boundaries beyond which features are unavailable, performance is throttled, or usage billing charges apply. Because Data 360 is built to scale, some limits can be adjusted to meet your business needs. Work with your account executive to find a solution that meets your goals.

          Some limits and guidelines vary by edition or org type, such as when using Data 360 in a Developer org. For limits and guidelines related to Data 360 in a Developer org, see Developer Edition Limits and Guidelines for Data Cloud. Limits that aren't designated as hard limits indicate the point at which exceeding the limit may negatively impact performance, functionality, or usability. Hard limits, on the other hand, can't be exceeded in a regularly provisioned org. Contact your Account Executive to request an increase of the hard limit where indicated.

          For limits related to Data 360 Reports, see Data 360 Reports and Dashboards: Limits and Limitations.

          Note: Some limits, guidelines, and usage billing calculations differ depending on your org’s license.

          For billing information, refer to this additional document if your Data 360 org is operating under a Data Cloud license.

          Refer to these documents instead if your Data 360 org is operating under a Customer Data Platform license.

          Contact your account executive if you aren’t sure which documents apply to you.

          General Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Location of data storage    
          Number of Data Cloud permission set licenses
          • 200,000 Data 360 permission sets
          • 200,000 additional Data Cloud Data Aware Specialist permission sets
          Yes

          Learn about Data 360 Standard Permission Sets, Permission Set Considerations, and Salesforce Features and Edition Allocations.

          Contact your Account Executive to adjust this limit.

          Number of data spaces
          • Depends on the number of licenses purchased
            Learn more about licenses in Data 360 Standard Editions and Licenses.
          Data Spaces permission sets
          • 1 per data space
          Yes

          Data spaces permission sets can’t be included in permission set groups.

          Data spaces permission sets aren’t counted against the Number of Data 360 Permission Set Licenses limit.

          Unsupported features
          • Data Kits: Marketing Cloud Engagement data streams aren’t installable
          Yes  
          Unsupported orgs
          • Professional Edition orgs
          • Government Cloud
          Yes  
          Maximum number of Data Cloud One companion connections
          • 3
            You can raise this limit by purchasing more Data Cloud One Companion Org licenses.

          Activation Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Scheduled activation publish frequency for standard segments
          • Minimum: 12 hours
          • Maximum: 24 hours
           

          Activation frequency is tied to the segment publish frequency for standard segments.

          Learn more in Review Publish History of Your Activation.

          Scheduled activation publish frequency for rapid segments
          • Minimum: 1 hour
          • Maximum: 4 hours
            Activation frequency is tied to the segment publish frequency for rapid segments.
          Total number of attributes
          • 100
            Learn more about attributes in Create Activation for a Segment.
          Number of segments to an activation target
          • 1
          Yes Each segment can be sent to an activation target one time. Learn more in Create Activation for a Segment.
          Total number of activation targets
          • 300
            An activation target is information stored for a given activation platform, which is the location that a segment’s data is sent to during the activation process.
          Total number of activations
          • No set limit
            There’s no limit on how many activations you can have.
          Total number of activations with related attributes
          • No set limit
            There’s no limit on how many of your activations can have related attributes. However, Customer Data Platform does have a limit. Learn more in Customer Data Platform Guidelines and Limits.
          Total number of related attributes per activation
          • 300 across no more than 4 data model objects in the same path
          Yes Learn more in Considerations for Selecting Related Attributes in Data 360 Activations.
          Maximum records in a segment with related attributes
          • 150 million
          Yes If the segment has more than 150 million records and contains related attributes when you publish, the segment doesn’t activate. Learn more in Considerations for Selecting Related Attributes in Data 360 Activations.
          Maximum levels away from the Activation Membership DMO with related attributes
          • 5
          Yes Choose attributes from DMOs up to 5 levels away from your selected Activation Membership DMO. Learn more in Considerations for Selecting Related Attributes in Data 360 Activations.
          Maximum number of related attribute records selected
          • 25
          Yes Select the number of records sent for a given attribute in your activation payload.
          Number of ad audience activations to ad audience partners
          • Total number of contracted ad audiences
          Yes

          Ad audience activations are available for purchase.

          The ad audience partners are Google Ads, Meta (Facebook), and Amazon Ads.

          One ad audience allows a customer to activate one segment to an ad audience partner.

          For example, sending one segment to a Meta account and one segment to a Google account equals two ad audiences.

          Maximum number of characters for file storage activation
          • 200
          Yes  

          Calculated Insights Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          How often calculated insights are processed
          • System managed: Multiple times a day
          • Scheduled: Users can configure a schedule that best meets their needs
           

          System managed calculated insights are automatically converted to scheduled calculated insights when the calculated insight is edited.

          With schedules, users can pick a start date, time, and frequency for processing calculated insights. A calculated insight won't run if there are no changes to source data, object mappings, or configurations.

          Maximum number of times a calculated insight can be manually processed in any 24-hour period
          • 30 per calculated insight
          Yes  
          Maximum calculated insight execution time
          • 2 hours
           

          Insights that run for longer than 2 hours may be terminated by the system.

          If a calculated insight runs for longer than 2 hours, review your data shape and optimize it.

          Maximum number of nested calculated insights
          • 4
          Yes Refers to how many existing insights you can use within a new insight.
          Maximum number of dimensions per calculated insight
          • 10
          Yes A dimension contains a qualitative value, such as a product name, date, or profile ID.
          Maximum number of measures per calculated insight
          • 50
          Yes A measure is a quantitative aggregated value, such as an average or total amount.
          Total number of calculated insights per tenant
          • 300
          Yes

          This limit includes calculated insights in active, inactive, and draft statuses.

          A tenant is a specific org or instance in Salesforce.

          Maximum number of real-time insights
          • 20
          Yes Real-time insights can only be built from real-time data graphs.
          Maximum number of real-time insight fields (dimensions and measures)
          • 15
          Yes

          This limit is for the total number of dimensions and measures in the real-time insight.

          The primary key for the root node of the real-time data graph is automatically added to the real-time insight and can’t be removed. This dimension counts as one of your maximum fields.

          Code Extension Guidelines and Limits (Beta)

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum custom code deployment package size (zipped)
          • 5 GB
          Yes  
          Maximum runtime (duration) of a custom script before it's canceled by the system
          • 24 hours
          Yes  
          Retention period of custom code logs stored in a DLO
          • 2 years
          Yes  

          Data Actions Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Total number of data actions per tenant
          • 200
          Yes  
          Total number of data action targets per tenant
          • 100
          Yes This limit is enforced by the UI.
          Total number of rules per data action
          • 10
          Yes Refers to the number of filter rules in a data action definition.
          Maximum number of data graph profile records for enrichment
          • 10 million
            Refers to the maximum number of data graph records that you can enrich with the primary object.
          Total number of real-time data actions per real-time data graph
          • 10
          Yes  

          Data Explorer Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Total number of columns queried at a time
          • 10
          Yes  
          Total number of rows displayed per page
          • 100
          Yes  

          Data Federation Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of external data lake objects (DLOs) per org
          • See Data Services Billable Usage Types for Data 360
            A limit on the number of external DLOs hasn’t been defined. Usage of external DLOs can impact billing.

          Data Graphs Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of data graphs per org
          • Standard data graph: 25
          • Real-time data graph: 25
          Yes  
          Maximum number of draft data graphs per org
          • Standard data graph: 5
          • Real-time data graph: 5
             
          Scheduled data graph refreshes per day
          • Standard data graph: 1

            You can change this default setting.

          • Real-time data graph: 24
          Yes

          Standard data graphs can be set to refresh automatically every 30 minutes, four hours, daily, weekly, or monthly. You can also refresh them manually whenever you want.

          Real-time data graphs refresh instantly for active sessions. The lakehouse version of the real-time data graph refreshes every hour.

          Maximum number of records across all data model objects and calculated insights included in a data graph
          • Standard data graph: 200 million
          • Real-time data graph: 100 million
            Including more records in a data graph impacts data graph refresh times and query response times.
          Maximum size of a real-time data graph
          • 200 KB
          Performance may degrade if the size of the data graph exceeds 200 KB. This can affect refresh times and query response speeds.
          Maximum number of objects per data graph
          • Standard data graph: 25
          • Real-time data graph: 25
          Yes The data graph’s primary data model object, related data model objects, and calculated insights count toward this limit.
          Maximum number of fields per data model object that can be included in a data graph
          • Standard data graph: 50
          • Real-time data graph: 50
          Yes Primary key, foreign key, and key qualifier fields aren’t included in this limit.
          Maximum number of measures per calculated insight that can be included in a data graph
          • Standard data graph: 5
          • Real-time data graph: 5
          Yes  
          Maximum number of total fields and measures that can be included in a data graph
          • Standard data graph: 200
          • Real-time data graph: 200
          Yes Primary key, foreign key, and key qualifier fields aren’t included in this limit.
          Maximum number of engagement events in a data graph
          • Standard data graph: 100
          Yes The effective limit on engagement events in a data graph is whichever threshold is reached first: the event count or the event age.
          Maximum age of engagement data included in a data graph
          • Standard data graph: 30 days (720 hours)
          • Real-time data graph: 30 days (720 hours)
          Yes The effective limit on engagement events in a data graph is whichever threshold is reached first: the event count or the event age.
          Maximum number of levels that data model objects (DMO) can be nested below the primary DMO in a data graph
          • Standard data graph: 5
          • Real-time data graph: 5
          Yes Levels are represented as a tree structure in the data graphs editor. The nesting tree including the primary DMO can have a total of 6 levels.
          Maximum number of records included per DMO
          • Standard data graph: 1,000
          • Real-time data graph: 1,000
          Yes The default is 100 sorted records per DMO in a data graph. You can adjust this limit up to 1,000 records.
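The engagement-event rows above cap a data graph at 100 events no older than 30 days (720 hours), whichever threshold is reached first. A minimal sketch of that windowing logic follows; the function name and helper are hypothetical illustrations, not part of Data 360:

```python
from datetime import datetime, timedelta

# Limits from the table above (assumed constants for illustration).
MAX_EVENTS = 100
MAX_AGE = timedelta(hours=720)  # 30 days

def effective_events(timestamps, now=None):
    """Return the event timestamps that fit in a standard data graph:
    no older than 720 hours, newest first, and at most 100 events."""
    now = now or datetime.utcnow()
    recent = [t for t in timestamps if now - t <= MAX_AGE]
    recent.sort(reverse=True)       # newest first
    return recent[:MAX_EVENTS]      # the count cap applies after the age cap
```

Whichever cap bites first determines the effective window: a sparse stream is bounded by the 30-day age, a busy one by the 100-event count.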

          Data Ingestion Guidelines and Limits

          General

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of fields per data stream
          • 1,050
          Yes  
          Maximum number of fields of a single type in a data stream
          • 800
          Yes  
          Total number of CRM custom objects
          • See Data Services Billable Usage Types for Data 360
            This limit applies to the number of custom objects ingested from each connected CRM org.
          Total number of CRM orgs that can be connected to Data 360
          • No limit
             
          Total number of data models
          • 7,500
          Yes  
          Total number of data streams
          • 5,000
          Yes  
          Total number of data lake objects (DLOs)
          • 5,000
          Yes This limit applies to data lake objects created as part of data streams or in a standalone manner.
          Total number of Marketing Cloud Personalization (Interaction Studio) datasets per instance
          • 5
             
          Maximum number of rows for a Marketing Cloud Engagement Data Extension full extract
          • 100 million
          Yes  

          GCS, Azure, and S3 Bulk Ingestion

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Optimal number of file rows when creating S3 data streams
          • 1,000
            When creating an S3 data stream, you may need to use a file smaller than the data you’d like to ingest. The smaller file should contain the same headers and about 1,000 rows. After the data stream is created, larger files can be placed in the file path, provided that they fall within the set limits.
          Optimal size per CSV file
          • 500 MB
            Sometimes, particularly for Day 0 loads, it’s necessary to move large data volumes. The maximum file size and maximum files per scheduled run parameters aren’t intended to be stretched to their respective maximums concurrently. Instead, consider the tradeoffs between the two extremes and refer to the other limit guidelines for more context.
          Optimal size per Parquet file
          • 50 GB
           
          Maximum number of files per scheduled run
          • 1,000
           
          Maximum size per uncompressed CSV file
          • 200 GB
             
          Maximum size per compressed CSV file
          • 250 MB
            Compressed CSV files aren’t splittable, meaning they don’t allow for parallel processing, so their use is discouraged. If you send a compressed file, include only a single file per zip archive.
          Maximum size per uncompressed parquet file
          • 200 GB
            We recommend using Parquet’s compression capabilities to reduce file size.
          Maximum size per compressed parquet file
          • 100 GB
            Compressed parquet files are splittable and support parallel processing of ingestion. There are several native compression options. Learn more about supported compression formats in Supported File Formats in Data 360.
          Maximum data size for upsert
          • 50 GB
           

          When a data stream is on upsert, it’s best not to exceed a 50-GB aggregate across all files in a single run. This number is strongly influenced by the number of distinct data dates present in the table, the overall size of the table, and several other factors, such as the number of columns. As these variable inputs increase, you may experience performance degradation.

          For larger target tables and larger upsert ingestions, we strongly recommend using engagement tables and selecting an engagement date field that contains many distinct dates. Not using engagement tables, or not selecting a high-cardinality date field, can cause performance degradation.

          Maximum data size for full refresh
          • 1,000 GB
            Data size is the sum of all file sizes for a single data stream job. When a data stream is set for full refresh, it’s best not to exceed 1,000 GB for all files in a single run. This number is strongly influenced by the number of distinct data dates present in the table, the overall size of the table, and several other factors, such as the number of columns. As these input variables increase, you may experience performance degradation.
          Maximum number of characters for a single cell
          • 70,000
          Yes  
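To stay near the optimal ~500 MB CSV size from the table above, a large Day 0 extract can be pre-split into header-preserving chunks before it’s placed in the bucket. The splitter below is an illustrative sketch, not a Salesforce tool; the function name and output layout are assumptions:

```python
import os

TARGET_BYTES = 500 * 1024**2  # ~500 MB optimal CSV size from the table above

def split_csv(path, out_dir, target_bytes=TARGET_BYTES):
    """Split a large CSV into chunks of roughly `target_bytes` each,
    repeating the header row in every chunk so each file ingests
    independently. Returns the list of chunk paths."""
    os.makedirs(out_dir, exist_ok=True)
    parts = []
    with open(path, newline="") as src:
        header = src.readline()
        part, out, written = 0, None, 0
        for line in src:
            if out is None or written >= target_bytes:
                if out:
                    out.close()
                part += 1
                name = os.path.join(out_dir, f"part_{part:04d}.csv")
                out = open(name, "w", newline="")
                out.write(header)
                written = 0
                parts.append(name)
            out.write(line)
            written += len(line)
        if out:
            out.close()
    return parts
```

Keeping each chunk near the optimal size, rather than maximizing both file size and file count at once, follows the tradeoff guidance above.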

          Local File Upload

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of characters for a single cell in a local file upload
          • 70,000
          Yes  
          Supported delimiters
          • We currently only support the comma (,) delimiter.
          Yes  
          Header row requirements
          • CSV files require a header name for every field, in the same format as Cloud Drive uploads.
          Yes  
          Unsupported refresh of data streams
          • A data stream created from a local file upload can't be automatically refreshed. You can upload a new file to refresh your data.
          Yes  
          Maximum number of data streams created from a local file upload
          • 100
          Yes  
          Unsupported file or column names
          • File or column names with multi-byte characters aren't supported.
          Yes  
          Primary key uniqueness restrictions
          • Primary key uniqueness isn’t enforced for local CSV file uploads.
          We recommend ensuring that your primary key column contains only unique values before uploading a local file.
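Because primary-key uniqueness isn’t enforced for local CSV uploads, a quick preflight pass can catch duplicate keys, missing header names, and cells over the 70,000-character cap before the file is uploaded. This helper is a hedged sketch, not part of the product:

```python
import csv

MAX_CELL_CHARS = 70_000  # per-cell limit from the table above

def preflight_local_csv(path, pk_column):
    """Hypothetical preflight check for a local file upload: verify a
    header name exists for every column, every cell is within the
    70,000-character cap, and the chosen primary-key column is unique.
    Returns a list of problem descriptions (empty means clean)."""
    problems, seen = [], set()
    with open(path, newline="") as f:
        reader = csv.DictReader(f)  # comma is the only supported delimiter
        if not reader.fieldnames or any(not h for h in reader.fieldnames):
            problems.append("every column needs a header name")
        for n, row in enumerate(reader, start=2):
            for col, val in row.items():
                if val and len(val) > MAX_CELL_CHARS:
                    problems.append(f"line {n}: cell in {col!r} exceeds {MAX_CELL_CHARS} chars")
            key = row.get(pk_column)
            if key in seen:
                problems.append(f"line {n}: duplicate primary key {key!r}")
            seen.add(key)
    return problems
```

Running a check like this before upload avoids discovering duplicates only after the data stream is created, since a local-file stream can only be refreshed by uploading a new file.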

          SFTP Ingestion

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum size per file
          • 2 GB
          Yes The SFTP Connector can load up to 1,000 files that are less than 2 GB but totaling no more than 30 GB per data stream run. If you have more data, you can use upsert mode for consecutive runs.
          Maximum number of files included in a scheduled run
          • 1,000
          Yes
          Maximum data per data stream run
          • 30 GB
          Yes
          Maximum size limit for file allowed during field analysis
          • 4.1 MB
          Yes  
          Maximum file size for PGP-encrypted schema file analysis
          • 25 MB
          Yes  
          Maximum object size for fields
          • 25 MB
          Yes  
          Maximum total size allowed for manual extract
          • 4.5 GB
             
          Maximum total number of files allowed for manual extract
          • 1,000
          Yes  
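The SFTP rows above combine three caps per data stream run: each file under 2 GB, at most 1,000 files, and no more than 30 GB in total. A small helper can validate a planned batch against all three at once; the function and its messages are illustrative assumptions, not a Salesforce API:

```python
GB = 1024**3
MAX_FILE = 2 * GB    # each file must be less than 2 GB
MAX_FILES = 1_000    # files per scheduled run
MAX_TOTAL = 30 * GB  # data per data stream run

def check_sftp_batch(sizes):
    """Check a planned SFTP run (a list of file sizes in bytes)
    against the per-file, file-count, and total-size caps above.
    Returns a list of violations (empty means the batch fits)."""
    errors = []
    if len(sizes) > MAX_FILES:
        errors.append(f"{len(sizes)} files exceeds the {MAX_FILES}-file cap")
    if any(s >= MAX_FILE for s in sizes):
        errors.append("at least one file is 2 GB or larger")
    if sum(sizes) > MAX_TOTAL:
        errors.append("batch exceeds 30 GB total")
    return errors
```

If a batch exceeds the 30-GB total, the guidance above is to spread it across consecutive runs in upsert mode rather than enlarging a single run.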

          Data Model Object Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Fields per data model object
          • 800 per field type
          • 1,050 total fields
          Yes Both custom and system fields are counted toward this limit.
          Maximum number of data model objects
          • 7,500
          Yes  
          Maximum number of custom relationships per data model object
          • 25
          Yes  

          Data Shares Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Total number of data shares per org
          • 100
          Yes  
          Total number of data share targets per org
          • 20
          Yes This limit isn’t applicable for Salesforce Data 360 data share targets.
          Maximum number of objects included per data share
          • 250
          Yes  

          Data Transforms Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Total number of streaming data transforms per org
          • 25
          Yes  
          Maximum number of characters in a streaming data transform SQL statement
          • 20,000
          Yes  
          Maximum number of batch transforms that can be created in an org
          • 1,000
          Yes Contact your Account Executive to adjust this limit.
          Maximum runtime (duration) of a batch transform job before it’s canceled by the system
          • 24 hours
          Yes  
          Maximum concurrent batch transforms that can be run at a time
          • 50
          Yes  

          Document AI Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum size of unstructured data files for Document AI processing—PDF, JPEG, JPG, and PNG
          • 10 MB
          Yes LLM token consumption varies based on the data density and complexity of the input file and the generated output. These limits apply to the UI and API for extracting data and schemas.
          Maximum number of pages per PDF for Document AI processing
          • 25
          Yes  
          Maximum number of requests per minute for schema extraction
          • 20
          Yes  
          Maximum number of fields the root object and every table can have when creating a document schema config
          • 50
          Yes  
          Maximum number of fields in a schema tree when calling the data extraction API with an inline schema
          • 100
          Yes  

          AI Model Builder Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Connected Models (Bring Your Own Model (BYOM)): Maximum number of input variables for an AI model per tenant
          • 100
          Yes  
          AI-Created Models (Created From Scratch): Maximum number of input variables for an AI model per tenant
          • 50
          Yes  
          AI-Created Models (Created From Scratch): Numeric variables automatically categorized into groups
          • 10 (numeric fields are grouped into these ranges based on their distribution)
          Yes If a numeric field has fewer than 10 unique values, the numbers aren’t grouped.
          AI-Created Models (Created From Scratch): Number of unique categories per model variable
          • 100
          Yes

          If a text variable has more than 100 unique categories, the top 100 (ranked by occurrence) will remain unique. Any remaining categories are grouped into an "Other" category. Null values in Date, Date Time, and Text variables are grouped into a separate "Null" category.

          Missing numeric values are placed in an "Unspecified" category.

          While the category limit is 100 based on the data provided, the extra "Other" and "Null" buckets mean it’s possible to have up to 102 categories.

          AI-Created Models (Created From Scratch): Minimum number of occurrences required to consider a single text value as unique for model training
          • 25
          Yes  
          AI-Created Models (Created From Scratch): Minimum number of rows
          • 400
          Yes  
          AI-Created Models (Created From Scratch): Maximum number of rows
          • 20 million
          Yes For the XGBoost algorithm, we automatically use a representative sample of only 5 million rows from the 20 million provided.
          AI-Created Models (Created From Scratch): Maximum number of concurrent model training jobs per org
          • 2
            Any additional model training jobs remain queued until a job opens up.
          All Predictive Models: Maximum number of active prediction jobs per org
          • 100
             
          All Predictive Models: Maximum number of active models
          • 2,000
             
          All Predictive Models: Maximum number of rows for each inference request to an external provider
          • 200
             
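The top-100 category behavior described above (plus the extra "Other" and "Null" buckets) can be sketched as follows. This function is an illustration of the grouping rule, not the actual model-training pipeline:

```python
from collections import Counter

TOP_N = 100  # unique-category cap from the table above

def group_categories(values, top_n=TOP_N):
    """Keep the top_n most frequent text values as unique categories,
    collapse the rest into "Other", and send missing values to a
    separate "Null" bucket, so the final set can hold up to
    top_n + 2 categories."""
    present = [v for v in values if v is not None]
    keep = {v for v, _ in Counter(present).most_common(top_n)}
    out = []
    for v in values:
        if v is None:
            out.append("Null")
        elif v in keep:
            out.append(v)
        else:
            out.append("Other")
    return out
```

With the cap plus the "Other" and "Null" buckets, a variable can end up with up to 102 distinct categories, matching the note above.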

          Identity Resolution Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Total number of identity resolution rulesets
          • 5 per primary data model object per data space
          Yes

          This limit is enforced in the UI.

          All profile records processed by all active rulesets are summed when calculating billable usage. To avoid excess credit consumption, we recommend deleting unneeded rulesets.

          Maximum scheduled job frequency per ruleset
          • Once per day
          Yes Because the time of day can vary, ruleset jobs can occasionally run more than once in a 24-hour period. Scheduled ruleset jobs are skipped if there are no changes to source data, object mappings, or ruleset configurations. This limit is enforced automatically and isn't configurable.
          Maximum number of ruleset jobs in any 24-hour period
          • 4 per ruleset per data space
          Yes Ruleset jobs don’t run if there are no changes to source data, object mappings, or ruleset configurations, or if you’ve already run 4 ruleset jobs in the last 24 hours.
          Maximum number of source profiles that can be unified into a single unified profile
          • 50,000
          Yes Large profiles usually result from poor data quality or match rules that are too broad. To reduce the number of matching records, clean your data or refine your match rules.
          Maximum size of source records processed in a ruleset
          • 15 KB
          Yes Records that are larger than 15 KB are skipped when rulesets run.
          Maximum number of match rules per ruleset
          • 10
          Yes  
          Maximum number of match criteria per match rule
          • 10
          Yes  
          Maximum combined character count of values reviewed by a match rule
          • 500 characters
          Yes If the sum of characters of values being matched by a single match rule is greater than 500, some values will be truncated during matching. This can result in incorrect matches.
          Match methods used during real-time matching
          • Exact and Exact Normalized match only
          Yes All match rules are evaluated using the Exact or Exact Normalized match methods during real-time matching. The selected match method is used during scheduled matching.
          Maximum number of individual profile identifiers included in a unified profile
          • 75
          Yes

          The source profile identifiers that can be included in a real-time data graph are further limited by data source.

          • Include no more than 25 profile records that include source records ingested by website and mobile connections.
          • Include no more than 50 other source profiles.
          Maximum number of engagement events included in a unified profile
          • 100
          Yes See also Scheduled and Real-Time Matching in Identity Resolution.
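The 500-character match-rule guideline above can be checked ahead of time by summing the lengths of the values a single rule compares. A hypothetical helper, not part of Data 360:

```python
MAX_MATCH_CHARS = 500  # combined-character guideline from the table above

def match_rule_overflow(values):
    """Sum the characters of the values a single match rule compares
    and return how far past the 500-character cap they run (0 means
    safe). Values beyond the cap risk being truncated during
    matching, which can produce incorrect matches."""
    total = sum(len(v or "") for v in values)
    return max(0, total - MAX_MATCH_CHARS)
```

A nonzero result signals that some values would be truncated; trimming the fields in the rule, or splitting it into narrower rules, keeps matching within the guideline.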

          Intelligent Context Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of PDF files that can be uploaded
          • 5
          Yes  
          Maximum size of PDF files that can be uploaded
          • 10 MB
          Yes  

          Private Connect Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of private network routes
          • 1
          Yes Additional private network routes are available for purchase.

          Real-Time Insights Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of real-time insights
          • 20
          Yes Real-time insights can only be built from real-time data graphs.
          Maximum number of real-time insight fields (dimensions and measures)
          • 15
          Yes
          • This limit is for the total number of dimensions and measures in the real-time insight.
          • Real-time insights automatically include the primary key from the real-time data graph they’re built on as a dimension. The primary key can’t be removed.
          • All dimensions and measures used in the real-time insight must be added to the real-time data graph first.
          Maximum number of related-DMOs
          • 10
          Yes
          • Only DMOs that have a many-to-one relationship with an existing real-time Data Graph Engagement DMO can be added.
          • The DMO must have a direct relationship to the real-time Data Graph Engagement DMO.
          • The maximum applies across all active real-time insights in all data spaces.

          Segmentation Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Total number of active segments
          • 9,950
          Yes An active segment is a created segment with all functionality available. This limit is enforced by the UI. Learn more in Create a Segment in Data 360 and Segment Types and Statuses.
          Total number of attributes with value suggestions
          • 500
          Yes Value suggestions can be enabled for up to 500 attributes for your entire org. Learn more in Use Value Suggestions in Segmentation.
          Total number of standard concurrent publishes
          • 50
          Yes  
          Total number of filters
          • 100
          Yes You can have up to 50 filters each in the Include and Exclude tabs. Learn more in Segment Your Data in Attributes.
          Total number of months that events are queried for standard segments
          • 24
          Yes The segmentation event limit is 24 months. This limit is enforced by the UI. For scheduled and manual publishes, segments fail with an error if a date range extends past the 24-month window. To run the segment again, adjust your event dates to fit within that window.
          Maximum number of scheduled publishes per standard segment per day
          • 2
          Yes This limit is enforced by the UI. Learn more in Publish a Segment in Data 360.
          Maximum number of rapid segments
          • 20
             
          Total number of rapid concurrent publishes
          • 20
             
          Total number of days that events are queried for rapid segments
          • 7
            The segmentation event limit is 7 days. This limit is enforced by the UI. For scheduled and manual publishes, segments fail with an error if a date range extends past the 7-day window. To run the segment again, adjust your event dates to fit within that window.
          Maximum number of scheduled publishes per rapid segment per day
          • 24
             
          Maximum number of real-time segments
          • 35
             
          Maximum number of engagement data model objects included in a real-time segment
          • 1
          Yes  
          Maximum number of streaming events included in a real-time segment
          • 1
          Yes  
          Maximum levels of nested segments in a real-time segment
          • 1
          Yes  
          Exclusion in a real-time segment
          • Not available
          Yes  
          Total number of processed records for rapid segments
          • 600 million
          Yes This limit is for records processed for a rapid segment, which is the sum of records present in the data streams associated with the segment. This limit isn’t associated with the segment population.
          Maximum number of Einstein segments created per month
          • 160
            The default limit is 20.
          Maximum number of waterfall segments
          • 20
            This limit is enforced by the UI. Learn more in Create a Waterfall Segment.

          Streaming Insights Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Aggregation time window
          • Minimum: 1 minute
          • Maximum: 24 hours
          Yes Learn more about the differences between Streaming Insights and Calculated Insights.
          Total number of dimensions
          • 10
          Yes For Streaming Insights, window start and window end are required dimensions and are included in the total dimension limit.
          Total number of measures
          • 5
          Yes Measures can use Count and Sum functions only.
          Total number of Streaming Insights
          • 20
          Yes  
          Unsupported features
          • Streaming insights aren’t available in segmentation or activation
          Yes  
          Supported primary DMO objects
          • DMOs with specific streaming sources
          • Streaming DMO must be Engagement type. A Streaming DMO of type Other or Profile isn’t allowed.
           

          Primary objects are filtered DMOs created from any of these streaming connectors:

          • Web SDK
          • Mobile SDK
          • Marketing Cloud Personalization
          • Ingestion API (Streaming)
          • Ingestion API (Batch)
          • Amazon Kinesis
          • Apache Kafka
          • CRM Streaming
          Allowed joins
          • Joined DMOs must be of Engagement type and use an inner join. A DMO of type Other or Profile isn’t allowed.
           

          During the identity resolution process, unified objects from the Individual DMO include these objects:

          • Unified Individual
          • Individual Link
          • Contact Point

          Joins on objects unrelated to Individual or objects related to Account aren't allowed.

          Unstructured Data and Search Index Guidelines and Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum size of unstructured data files - TXT or HTML formats
          • 4 MB
          Yes Files larger than 4 MB are added to unstructured data lake objects and unstructured data model objects, but they aren’t chunked or vectorized.
          Maximum size of unstructured data files - PDF, Excel, PowerPoint, Word, RTF, CSV, XML, Email, and Message formats
          • 100 MB
          Yes Files larger than 100 MB are added to unstructured data lake objects and unstructured data model objects, but they aren’t chunked or vectorized.
          Maximum size of unstructured data files - audio and video
          • File length: 1 hour
          • File size: 1 GB
          Yes  
          Maximum size of unstructured data files - image formats
          • 20 MB
          Yes  
          Maximum number of search indexes per Data 360 instance
          • 25
          Yes  
          Maximum number of fields that can be selected for pre-filtering
          • 10
          Yes With pre-filtering you can enhance a search index configuration by adding fields from an object or related objects to give users more ways to filter their searches.
          Maximum query filter length in a hybrid search query
          • 128,000 characters
          Yes  
          Maximum query filter clauses in a hybrid search query
          • 3,072
          Yes  
          Maximum number of terms in the IN clause in a hybrid search query
          • 4,096
          Yes  
          Maximum number of results for a hybrid search query
          • 2,048
          Yes  
          Maximum number of results for a vector search query
          • 16,000
          Yes  
          Maximum amount of unstructured data that can be added to an enriched index
          • 3 GB per day
             
          Search Index processing time when enriched indexing is enabled
          • Within 24 hours
            Processing time varies based on LLM processing. Datasets smaller than 200 MB are processed within 2 hours.
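
          The file-size limits above can be checked client-side before upload, so you know in advance whether a file will be chunked and vectorized. A minimal sketch in Python; the extension-to-limit mapping covers only a few representative formats from the table (extend it as needed), and the upload itself is out of scope:

          ```python
          import os

          # Size limits from the table above, in bytes. For text and document
          # formats, oversized files are still added to unstructured data lake
          # and data model objects, but they aren't chunked or vectorized.
          MAX_BYTES = {
              ".txt": 4 * 1024**2, ".html": 4 * 1024**2,      # 4 MB
              ".pdf": 100 * 1024**2, ".docx": 100 * 1024**2,  # 100 MB
              ".csv": 100 * 1024**2, ".xml": 100 * 1024**2,
              ".png": 20 * 1024**2, ".jpg": 20 * 1024**2,     # 20 MB (images)
              ".mp3": 1024**3, ".mp4": 1024**3,               # 1 GB (audio/video)
          }

          def will_be_vectorized(path: str) -> bool:
              """True if the file is within the size limit for chunking and vectorization."""
              ext = os.path.splitext(path)[1].lower()
              limit = MAX_BYTES.get(ext)
              if limit is None:
                  raise ValueError(f"no known limit for extension {ext!r}")
              return os.path.getsize(path) <= limit
          ```

          Note that audio and video also carry a 1-hour length limit, which a byte-size check alone can't verify.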

          API Guidelines and Limits

          The concurrent request and query limits of Data 360 APIs are independent of the concurrency limits governing the Salesforce platform APIs.

          API Queries

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Total number of rows queried at a time in Query Editor
          • 1,000
          Yes  
          Maximum timeout
          • 5 minutes
          Yes The HTTP 408 status code indicates a timeout.
          Total number of queries per day for Query API and Insights API
          • See Data Services Billable Usage Types for Data 360
            Applies to Query API v1 and v2.
          Total number of queries per day for Profile API
          • See Data Services Billable Usage Types for Data 360
          Total number of queries per month for Profile API
          • See Data Services Billable Usage Types for Data 360
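
          Query calls can hit the 5-minute timeout (HTTP 408) noted above, and throttled requests conventionally return HTTP 429; both are transient, so clients typically retry with exponential backoff. A minimal sketch, assuming a caller-supplied `send()` function that performs the actual HTTP call (the endpoint, auth, and client library are not part of this document):

          ```python
          import random
          import time

          RETRYABLE = {408, 429}  # 408: query timed out (5-minute cap); 429: throttled

          def backoff_seconds(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
              """Full-jitter exponential backoff: uniform in [0, min(cap, base * 2**attempt)]."""
              return random.uniform(0, min(cap, base * (2 ** attempt)))

          def call_with_retry(send, max_attempts: int = 5, base: float = 1.0):
              """Call send() -> (status_code, body); retry on retryable statuses."""
              status, body = send()
              for attempt in range(max_attempts - 1):
                  if status not in RETRYABLE:
                      break
                  time.sleep(backoff_seconds(attempt, base=base))
                  status, body = send()
              return status, body
          ```

          The full-jitter variant keeps simultaneous retries from re-synchronizing against the same throttle window.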

          Ingest Bulk API Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          API usage limits
          • Varies
          Yes After each request, your app must check the response code. The HTTP 429 Too Many Requests status code indicates that the app must reduce its request frequency.
          Bulk job retention time
          • 7 days
          Yes Bulk jobs with a status of Open or Upload Complete that are older than 7 days are deleted from the ingestion queue.
          Maximum number of files per job
          • 100 files
             
          Maximum payload size
          • 150 MB
          Yes Applies to CSV files uploaded via the Bulk API.
          Number of requests or jobs allowed per hour
          • 20
             
          Number of concurrent jobs per connection per object
          • 1
            A job must be associated with a specific connection, and every connection is associated with a specific schema that defines one or more objects. A connector can support multiple connections.
          Maximum number of concurrent jobs per org
          • 5
             

          Ingest Streaming API Limits

          These limits and guidelines are for streaming ingestion across Mobile and Web SDK and Ingestion API.

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          API usage limits
          • Varies
          Yes After each request, your app must check the response code. The HTTP 429 Too Many Requests status code indicates that the app must reduce its request frequency.
          Expected latency for Ingestion API
          • Non-real time: 2-3 minutes
          • Real-time: 300 milliseconds
          Yes


          Non-real-time data is processed asynchronously in batches roughly every 2-3 minutes. Received records are picked up on that cadence, but depending on the data volume and how busy the scheduler is, it can take additional time until the data is committed to storage and available for consumption.


          Expected latency for Mobile and Web SDK applications
          • Profile data: 2-3 minutes
          • Engagement data: 2-3 minutes
          • Real-time for Profile and Engagement data: 300 milliseconds
          Yes To reduce processing power and bandwidth requirements, mobile events are queued on the device.
          Maximum number of records that can be deleted via Streaming API deletion
          • 200 records
          Yes  
          Maximum payload size per request
          • 200 KB
          Yes JSON data uploaded via the Streaming API has a maximum body size of 200 KB per request. The HTTP 403 Forbidden status code indicates that the API request exceeded the 200-KB limit.
          Total number of requests per second across all Ingestion API object endpoints
          • 250 requests
             

          Profile API Limits

          Feature or Function Guidelines and Limits Hard Limit? Additional Information and Resources
          Maximum number of records returned per call
          • 49,999
          Yes  
          Total number of fields per record
          • 50
          Yes  
          Note
          Unified Profile objects that are filtered on the index column have accelerated response rates. Other Profile category objects can’t guarantee a faster response.