
CRM Analytics dataflow warnings and errors

Published Date: Apr 30, 2026
Description
A CRM Analytics dataflow job may result in a warning or an error message.

A warning indicates that the dataflow completed and all of its datasets were registered successfully, but there may be a resulting data quality issue. Expanding the dataflow job in the Data Manager user interface reveals which nodes are responsible for the warning. Investigating the root cause usually requires an Administrator with detailed knowledge of both the dataflow structure and the source data. Note that if the warning occurs in a templated app, the app may be digesting a feature that is not in use in the affected environment.

An error message occurs when a dataflow has failed and none of its datasets are registered. Re-running a failed dataflow can overcome a transient issue. If the failure is persistent, an Administrator will need to investigate the root cause more deeply, and if no resolution is found it may be necessary to engage Salesforce Support.

Below are the warnings and errors that may occur. For detailed information on each message and its potential resolutions, click the appropriate link.
Resolution

CRM Analytics dataflow warnings indicate a completed run with potential data quality issues, while errors indicate a full failure where no datasets are registered. The sections below describe each message, its cause, and steps to resolve it.

Warnings

The dataflow was completed, but augment node performed a LookupMultiValue augment operation on a date field, which can give unexpected results
This message occurs when a date field appears in the "right_select" of an augment transformation that uses the "LookupMultiValue" operation. This can produce unexpected date values in the resulting datasets.
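As an illustration only (the node, dataset, and field names below are invented), an augment node of this shape would produce the warning, because the date field "CloseDate" appears in the "right_select" of a "LookupMultiValue" augment:

```json
{
  "Augment_Account_Opportunity": {
    "action": "augment",
    "parameters": {
      "left": "Extract_Account",
      "left_key": ["Id"],
      "right": "Extract_Opportunity",
      "right_key": ["AccountId"],
      "relationship": "Opportunity",
      "operation": "LookupMultiValue",
      "right_select": ["Name", "CloseDate"]
    }
  }
}
```

Removing the date field from "right_select", or looking it up through a separate "LookupSingleValue" augment, is one way to avoid the ambiguous date values.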

The dataflow was completed, but the node didn't augment any columns
In an augment transformation, this warning is produced when the values of the left dataset's key have no matches in the right dataset's key, so no information from the right dataset is included in any of the rows.
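For example, in a hypothetical node like the following (names invented), the warning would appear if no value of "Account_ID__c" in the left dataset matches any "Id" in the right dataset, for instance because one side stores 15-character record IDs and the other stores 18-character IDs:

```json
{
  "Augment_Case_Account": {
    "action": "augment",
    "parameters": {
      "left": "Extract_Case",
      "left_key": ["Account_ID__c"],
      "right": "Extract_Account",
      "right_key": ["Id"],
      "relationship": "Account",
      "operation": "LookupSingleValue",
      "right_select": ["Name"]
    }
  }
}
```

Comparing sample key values from both sources is usually the quickest way to confirm why nothing matched.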

The sharing source and security predicate in this dataset version must be the same as in the dataflow
This warning is received when the Sharing Source in a dataflow register node does not match the one defined on the target dataset's edit page.
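For reference, the Sharing Source on a register node is set with the "rowLevelSharingSource" parameter. A hypothetical sketch (node and dataset names invented):

```json
{
  "Register_Opportunities": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "Opportunities",
      "name": "Opportunities",
      "source": "Augment_Opportunity_Account",
      "rowLevelSharingSource": "Opportunity"
    }
  }
}
```

The warning appears when this value differs from the Sharing Source shown on the target dataset's edit page; aligning the two settings clears it.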


Errors

Can't find dimension in update dataset
A key used in an augment transformation does not exist or its values are entirely null.

Can't propagate sharing rules through the augment node
The left and right sources of an augment transformation both contain an sfdcDigest of the same object in their lineages, and that object is then used as the Sharing Source for a register node.

Error accessing replicated dataset.: Invalid extract field name
In environments with Data Sync enabled, an sfdcDigest node contains a field that is not present in the Connected Dataset.
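The fields an sfdcDigest node extracts are listed in its "fields" parameter. In a hypothetical node like the one below (names invented), the error would be raised if, say, "Custom_Field__c" were not present in the object's Connected Dataset in Data Manager:

```json
{
  "Extract_Account": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Account",
      "fields": [
        { "name": "Id" },
        { "name": "Name" },
        { "name": "Custom_Field__c" }
      ]
    }
  }
}
```

Either add the missing field to the object's Data Sync configuration or remove it from the sfdcDigest node.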

Dataflow Error: Error during local fetch: Replicated dataset was not found
In environments with Data Sync enabled, an sfdcDigest node references an object with no corresponding Connected Dataset. This can also occur during an org migration, such as a migration to Hyperforce, while the dataset is being prepared for Analytics dashboards.

This error can occur if your org has been migrated to Hyperforce. If it has been more than 24 hours after the migration and your dataset hasn't self-replicated, reach out to support to see if additional steps are needed.

For more information, see Introducing Hyperforce - General Information & FAQ.


Field is not available
One or more fields are not visible to the Analytics Cloud Integration User, either because the field has been deleted or because it is not visible to that user's profile.

InternalServerError : Retried more than 15 times / InternalServerError: Retried more than 30 times
The Bulk API job created by Data Sync is too large. These messages can also indicate a temporary lack of server resources, in which case rerunning the job usually resolves the issue. Removing text fields with large values from the Data Sync configuration will resolve the error when the job itself is too large.

Why is my recipe or data sync job running out of time or being killed?
A recipe or data sync job can fail because the integration user can't authenticate.



NUMBER_OUTSIDE_VALID_RANGE
A modified Analytics Cloud Integration User profile does not contain "View All Data" or read access on a digested object.

Object is not available
The Analytics Cloud Integration User does not have access to an object, often because it is part of a managed package to which the Integration User does not have a license.

Queue wait time exceeded limit
Often a transient error that can be remedied by re-running the dataflow; it can also indicate an authentication problem with the Integration User.

TODO: // Query is too large. Must partition
The SOQL query generated from digesting an object contains too many characters.
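An sfdcDigest node's field list is translated into a single SOQL query, roughly of the form SELECT field1, field2, ... FROM object, so a digest with a very large "fields" array can exceed the query length limit. A trimmed hypothetical node (names invented) keeps the generated query short:

```json
{
  "Extract_Contact": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Contact",
      "fields": [
        { "name": "Id" },
        { "name": "Name" },
        { "name": "Email" }
      ]
    }
  }
}
```

Pruning fields that no downstream node consumes is the usual remedy.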

Additional Resources

Run a Dataflow

Knowledge Article Number

000382354

 