Snowflake Output Connection
Create a remote connection using the Snowflake output connector to write data from Salesforce Data Pipelines to a Snowflake table. Then, use Sync Out to push raw data from Salesforce Data Pipelines to Snowflake, or a Data Prep recipe output node to push transformed data.
How Salesforce Data Pipelines Output Connectors with Output Nodes Work
Salesforce Data Pipelines output connectors let you write the results of a recipe to an external system for further analysis, business automation, and storage. After you configure an output connector for the data source, create a recipe using Data Prep. Add an Output node to the recipe, select the output connection, and choose the Snowflake table name from the list of objects. When the recipe runs, Salesforce Data Pipelines writes the output dataset to the selected table. Each subsequent run deletes the previously written data and writes the new data.
Generate your Snowflake private key and private key passphrase by following the Snowflake Private Key documentation. If you rotate your Snowflake private key, manually update the connection properties with the new key.
Connect to Snowflake with OAuth
To use the Salesforce Data Pipelines Snowflake connector with an OAuth connection, you must configure Snowflake, Salesforce, Salesforce Data Pipelines, and an external OAuth authorization server.
These high-level instructions help you navigate the steps involved. Remember to contact your Network Security or IT department for help with configuration consistent with your organization’s security requirements.
- Configure Snowflake and your selected external authorization server. Here’s Snowflake's help, with detailed instructions for connecting to services like Okta and Microsoft Azure AD.
- With the authorization server configured, follow the steps under Define an Authentication Provider in Salesforce from Configure an Authentication Provider Using OpenID.
- If you’re using Okta, here are Okta's instructions for adding their service to Salesforce.
- With the authentication provider added to Salesforce, define a named credential in Salesforce. Select the OAuth 2.0 authentication protocol and Named Principal identity type. Use of External Credentials isn't supported.
- Add the Snowflake connection. For Authentication Type setting, enter OAuth.
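The Snowflake side of this configuration is typically an external OAuth security integration. Here's a minimal sketch, assuming Okta as the authorization server; the integration name, issuer URL, keys URL, and account URL are placeholders that come from your own Okta and Snowflake setup:

```sql
-- Hypothetical sketch of a Snowflake external OAuth security integration
-- for Okta. All names and URLs are placeholders; substitute your own.
CREATE SECURITY INTEGRATION okta_external_oauth
  TYPE = EXTERNAL_OAUTH
  ENABLED = TRUE
  EXTERNAL_OAUTH_TYPE = OKTA
  EXTERNAL_OAUTH_ISSUER = 'https://example.okta.com/oauth2/default'
  EXTERNAL_OAUTH_JWS_KEYS_URL = 'https://example.okta.com/oauth2/default/v1/keys'
  EXTERNAL_OAUTH_AUDIENCE_LIST = ('https://123abc.snowflakecomputing.com')
  EXTERNAL_OAUTH_TOKEN_USER_MAPPING_CLAIM = 'sub'
  EXTERNAL_OAUTH_SNOWFLAKE_USER_MAPPING_ATTRIBUTE = 'LOGIN_NAME';
```

Refer to Snowflake's external OAuth documentation for the parameter values that match your authorization server.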
Enable and Add the Snowflake Output Connector
- From Setup, enter Analytics in the Quick Find box.
- Select Settings under Analytics.
- Select Enable Snowflake Output Connection and click Save.
- In the Data Manager, click the Connections tab.
- Click New Connection.
- Click Output, select Snowflake Output Connector, then click Next.
- Enter the connection settings, as described in the Connection Settings section.
- Click Save & Test. Save & Test validates your settings by attempting to connect to the source. If the connection fails, Salesforce Data Pipelines shows possible reasons.
Connection Settings
All settings require a value, unless otherwise indicated.
| Connection Setting | Description |
|---|---|
| Connection Name | Identifies the connection. Use a convention that lets you easily distinguish between different connections. |
| Developer Name | API name for the connection. This name can’t include spaces. You can’t change the developer name after you create the connection. |
| Description | A short description of the connection. |
| Authentication Type | The type of authentication used for this connection. Accepted values are "OAuth", "Password", or "PrivateKey". |
| Named Credential | The Name field from a named credential stored in your Salesforce org. |
| Username | User name for the Snowflake account. |
| Password | Optional setting*. Password for your Snowflake account. |
| Account | Name of your Snowflake account. The account name is the first segment in the domain in your Snowflake URL. For example, 123abc is your account name in https://123abc.snowflakecomputing.com. |
| Warehouse | Snowflake warehouse name. This setting is case-sensitive, so enter the value exactly as it appears in Snowflake. |
| Role | Optional setting. Snowflake role assigned to the user that you’re using to connect. |
| Database | Snowflake database name. This setting is case-sensitive, so enter the value exactly as it appears in Snowflake. |
| Schema | Snowflake schema name. This setting is case-sensitive, so enter the value exactly as it appears in Snowflake. |
| Private Key | Optional setting*. A private key associated with your Snowflake account. Note: You must use an encrypted private key and passphrase generated with the Advanced Encryption Standard (AES). For the detailed steps, refer to Key-pair authentication and key-pair rotation in Snowflake Help. When using the openssl command to generate the encrypted key, be sure to replace des3 with aes256 to ensure advanced encryption is used. |
| Private Key Passphrase | Optional setting*. The passphrase associated with your specified private key. |
*Enter the Password or both the Private Key and Private Key Passphrase. Learn more about private keys in the Snowflake Private Key documentation.
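The key generation described above can be sketched with the following shell commands. The passphrase `MyPassphrase` and the file names are placeholders; substitute your own values and see Snowflake's key-pair authentication documentation for the full procedure:

```shell
# Generate a 2048-bit RSA private key encrypted with AES-256
# (note aes256 rather than des3, per the setting description above).
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes256 \
  -inform PEM -passout pass:MyPassphrase -out rsa_key.p8

# Derive the matching public key to register with your Snowflake user.
openssl rsa -in rsa_key.p8 -passin pass:MyPassphrase -pubout -out rsa_key.pub
```

Use the contents of `rsa_key.p8` as the Private Key setting and the passphrase you chose as the Private Key Passphrase setting.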
Push Data to Snowflake
With the Snowflake Output Connector configured, you have two options to push data to Snowflake from Salesforce Data Pipelines. To push augmented and transformed data, build a Data Prep recipe that merges and transforms the data to push to Snowflake. Add an output node and configure it to use the Snowflake Output connector.
- In the output node, select to write to an output connection.
- Select the connection name of the Snowflake Output connection you created.
- Select Apply.
- Save the recipe.
To push raw data, without augmentation or transformation, use Sync Out for Snowflake. Data is pushed with each Data Sync run; no Data Prep recipe is required.
Keep these behaviors in mind when working with the Snowflake output connector and a Data Prep recipe output node.
- You can use an output connection more than once per recipe, but each output node must write to a different object. The connector's per-run limit applies to each output node, and each output node is subject to the rolling 24-hour limit. To push again from the same recipe, add another connection with the same credentials.
- Output connections are only available for recipes built with Data Prep.
- When the prior run’s data is deleted in preparation for the current run, the earlier version of an output dataset is inaccessible. Set up a process to copy or use the output after each run.
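One way to preserve each run's output, sketched here in Snowflake SQL, is to snapshot the target table between runs with a zero-copy clone. The table names are placeholders:

```sql
-- Hypothetical snapshot step: clone the output table before the next
-- recipe run overwrites it. Replace the names with your own tables.
CREATE TABLE SALES_OUTPUT_SNAPSHOT_20250101 CLONE SALES_OUTPUT;
```

You could schedule a statement like this in Snowflake (for example, with a task) to run after each recipe run completes.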

