
Repopulate placeholder datasets created from deployment or sandbox refresh in CRM Analytics

Publication Date: Feb 15, 2024
Description

Best Practices for migrating CRM Analytics Datasets in Packages or Change Sets and refreshing Full(Template), Partial, or Dev Sandboxes


Root Cause of related issues: Packages and change sets do not contain data, and Full(Template), Partial, and Dev sandbox refreshes do not copy dataset contents. CRM Analytics relies on data to function properly.


When a Dataset is included in a package or change set, or copied during a Full(Template), Partial, or Dev sandbox refresh, only the Dataset metadata is included. These placeholder datasets will not appear in the UI and cannot be queried, though they will appear in the REST API.
  • In situations where a Dataset was generated as part of an sfdcDigest - transformation - sfdcRegister node series in the Dataflow, simply running the Dataflow in the target environment will repopulate the contents of the Dataset.
  • In situations where a Dataset is generated from a CSV or an external data source, follow the process below to recreate the dataset with the correct alias.
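
In the first case, the repopulating Dataflow can be as simple as a digest-register node pair. Below is a minimal sketch of such a definition; the object name, field list, node names, and dataset alias are illustrative assumptions (not from this article), and any intermediate transformation nodes are omitted for brevity:

{
  "DigestAccount": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Account",
      "fields": [
        { "name": "Id" },
        { "name": "Name" }
      ]
    }
  },
  "RegisterAccount": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "Account_Dataset",
      "name": "Account Dataset",
      "source": "DigestAccount"
    }
  }
}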
Resolution


Recreating CSV and External Datasets after Package/Change Set Deployment or Full(Template)/Partial/Dev Sandbox Refresh


Note: All of the following steps are to be performed in the target environment after deployment or refresh.

Before we begin, here is the intent of this process:
A change set or package deployment (or a sandbox refresh) will deploy a placeholder dataset container. In order for the dataset to be available in the UI, it must have contents. To get contents into the dataset, we will upload a new dataset with a different alias and then use a dataflow to overwrite the empty dataset with the replacement data.

1. Create New or Backup Existing Dataflow
  • If you have Data Sync enabled, you can create a new Dataflow for this process.
  • Without Data Sync, you'll need to use an existing Dataflow. Back up the existing Dataflow by selecting the Download option in the Dataflow Editor or Data Manager | Dataflows and Recipes options menu.
     
2. Upload Replacement Dataset
Upload the CSV/external dataset(s). If they are named the same as the deployed dataset(s), note that an incremental numeral is appended to the alias to keep it unique.
  • For CSV datasets: Upload the source CSV and metadata JSON through the Create | Dataset | CSV UI options.
  • For external data sources: Run the job that generates the dataset.

3. Optional: Collect Impacted Dataset Information
If more than one dataset is impacted, you'll need to edit the dataflow to perform the overwrite for each dataset. We recommend creating a paired list of all impacted datasets: the alias of each replacement dataset from step 2 (shown as "API Name" on the Dataset Edit screen in the UI) and the alias of each empty dataset from the package/change set or source environment. This may look like the following:
 
Replacement Dataset - Empty Dataset
  Replacement_A   -   Package_Dataset_A
  Replacement_B   -   Package_Dataset_B
 
4. Create Dataflow Definition
In your preferred JSON editor, create a new dataflow definition with an Edgemart - sfdcRegister node pair for each impacted dataset. Update the Node names, aliases, and source as needed. See the code sample below:

"Replacement_A" is the alias of the newly uploaded dataset with the replacement data.
"Package_Dataset_A" is the alias of the empty dataset from the package or change set.
{
  "EdgemartReplacementA": {
    "action": "edgemart",
    "parameters": {
      "alias": "Replacement_A"
    }
  },
  "RegisterDatasetA": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "Package_Dataset_A",
      "name": "Package Dataset A",
      "source": "EdgemartReplacementA"
    }
  }
}
   Save this dataflow definition.
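
If your list from step 3 contains more than one pair, repeat the Edgemart - sfdcRegister node pair once per dataset in the same definition. For example, with the two illustrative pairs from the step 3 list (Replacement_A/Package_Dataset_A and Replacement_B/Package_Dataset_B):

{
  "EdgemartReplacementA": {
    "action": "edgemart",
    "parameters": { "alias": "Replacement_A" }
  },
  "RegisterDatasetA": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "Package_Dataset_A",
      "name": "Package Dataset A",
      "source": "EdgemartReplacementA"
    }
  },
  "EdgemartReplacementB": {
    "action": "edgemart",
    "parameters": { "alias": "Replacement_B" }
  },
  "RegisterDatasetB": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "Package_Dataset_B",
      "name": "Package Dataset B",
      "source": "EdgemartReplacementB"
    }
  }
}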
 
5. Upload and Run Dataflow
Upload the dataflow definition JSON created in step 4 and run it. This will effectively copy the contents of the replacement dataset(s) into the empty dataset(s).
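
As an alternative to starting the run from Data Manager, the CRM Analytics REST API exposes a dataflow jobs resource. The sketch below follows that API but should be verified against your org's API version; the dataflow ID is a placeholder you would retrieve from the /wave/dataflows resource:

POST /services/data/v60.0/wave/dataflowjobs
{
  "dataflowId": "<your dataflow ID>",
  "command": "start"
}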
 
6. Verify Dataset Visibility
Verify the contents and visibility of the dataset(s) from the package/change set or refresh. You'll likely want to delete the replacement dataset(s) to avoid unnecessarily impacting your dataset row limit.
 
7. Clean-up
If you used an existing dataflow, restore the backed-up dataflow definition and run it. It should now perform normally, repopulating the Salesforce-generated datasets and interacting with the no-longer-empty CSV/external dataset(s) as expected.
Knowledge Article Number

000384102

 
Salesforce Help | Article