Technical Vendors in Intelligence
Intelligence supports these technical vendors:
| VENDOR | DETAILS |
|---|---|
| Amazon Web Services Redshift | Port - 5439. Connection string: jdbc:postgresql://HOST:PORT/DATABASE?user=USER&password=PASSWORD |
| Amazon Web Services S3 | Bucket Name - The highest hierarchy in S3. Directory - The folder, inside the bucket, from which you want to retrieve files. |
| ATARA | Host/Port - Enter the host; the port should be 22. User - Enter your username and password. Use Private Key Authentication - If you are using private key authentication, enter the key here. Directory - Enter the directory containing the file. |
| Box | Credentials - Add your Box.com credentials. Directory - Enter the directory containing the file. Note that when using Box.com, the directory should include the folder IDs, not the folder names. Folder IDs can be retrieved from the URL in the provider's platform. |
| Cloudera IMPALA | jdbc:hive2://HOST:PORT/SCHEMA |
| Dropbox | Credentials - Add your Dropbox credentials. Directory - Enter the directory containing the file. Team Space - Select this to retrieve files from a team space. |
| ftp: | Host/Port - Enter the host and port. Connection Mode - Set according to your settings. Credentials (User/Pass) - Enter your username and password. Directory - The folder in which the relevant file is located. Security Mode - Set according to your settings. |
| Google Big Query | To connect to Google Big Query, you must have the default full owner access to the whole project. |
| Google Drive | Credentials - Add your Google credentials. Directory - Enter the directory containing the file. |
| Google Storage | Credentials - Add your credentials (identical to the Google Big Query credentials). Bucket Name - The highest hierarchy in Google Storage. Directory - The file location. |
| GreenPlum | Port 5432 jdbc:postgresql://HOST:PORT/DATABASE?user=USER&password=PASSWORD |
| Hadoop HDFS | Host/Port - Enter both host and port. HDFS User - Enter the username only; no password is required. HDFS Directory. |
| Hive | jdbc:hive2://HOST:PORT/SCHEMA?user=USER&password=PASSWORD |
| HP VERTICA | Port - 5433 jdbc:vertica://HOST:PORT/DATABASE?user=USER&password=PASSWORD |
| http:// | File Extension - If the URL does not lead to a specific file, enter the file extension so that the system knows how to process the file. Authentication Type - None means no authentication is needed; Basic requires a username and password. Headers. POST Body (Optional) - You can add further information to the request. |
| Microsoft Azure Blob Storage | Connection String - Insert the connection details, which can be found in the 'Access Keys' tab. Container - Enter the container where your blobs (files) can be found. |
| Microsoft SQL Server | Port - 1433 jdbc:sqlserver://HOST:PORT;database=DATABASE;user=USER;password=PASSWORD |
| MongoDB | mongodb://USER:PASSWORD@HOST:PORT/DATABASE.COLLECTION. Note that MongoDB expects to receive a basic JSON file, without an array of nested values. If the JSON file you upload is structured differently than the default structure, you may need to make changes in the transformers to ensure MCI can digest the file properly. |
| OneDrive for Business | Credentials - Add your credentials. Directory - Enter the directory containing the file. For example, if the path is Files > Shared > Mapping Tables, use /Shared/Mapping Tables. Note: We do not support connecting the OneDrive connector to a folder used for integration with other Microsoft services, such as SharePoint. We do, however, support connections with shared folders, so if possible, move the files to a shared folder, enabling Datorama to retrieve them with the proper path. |
| OneDrive Personal | Credentials - Add your credentials. Directory - Enter the directory containing the file. For example, if the path is Files > Shared > Mapping Tables, use /Shared/Mapping Tables. To link your OneDrive account to SharePoint, click Use Sharepoint and enter your SharePoint site name. Microsoft Graph API permissions must be enabled in your SharePoint account. Note: We support connections with shared folders, so if possible, move the files to a shared folder, enabling Marketing Cloud Intelligence to retrieve them with the proper path. |
| Oracle | Port - 1521 jdbc:oracle:thin:USER/PASSWORD@HOST:PORT:SID |
| Presto | jdbc:presto://HOST:PORT/DATABASE |
| rss: | File Extension - If the URL does not lead to a specific file, enter the file extension so that the system knows how to process the file. Authentication Type - None means no authentication is needed; Basic requires a username and password. Headers. POST Body (Optional) - You can add further information to the request, for example, parameters. |
| SAP HANA | port - 30015 jdbc:sap://HOST:PORT/DATABASE?user=USER&password=PASSWORD |
| SmartSheet | Credentials: Add your SmartSheet credentials. Folder/Workspace ID: Enter either the Folder ID or the Workspace ID; leaving this box empty will retrieve data from the 'Sheets' root folder. |
| SnowFlake | Connection String: Enter your connection string. Credentials: Enter your User Name and Password. |
| Treasure Data | jdbc:presto://HOST:PORT/DATABASE |
| Treasure Data Tank | jdbc:presto://HOST:PORT/DATABASE |
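Several rows above share the same JDBC URL pattern, differing only in scheme, separator style, and default port. As a rough illustration (not part of the product), the templates can be filled in with plain Python string formatting; the `jdbc_url` helper and all field values below are hypothetical examples, not real credentials:

```python
# Hypothetical helper that fills in the JDBC URL templates from the vendor
# table. Replace the example values with your own HOST, PORT, DATABASE,
# USER, and PASSWORD.

JDBC_TEMPLATES = {
    # Redshift and GreenPlum use the PostgreSQL-style URL from the table.
    "redshift":  "jdbc:postgresql://{host}:{port}/{database}?user={user}&password={password}",
    # SQL Server uses semicolon-separated properties.
    "sqlserver": "jdbc:sqlserver://{host}:{port};database={database};user={user};password={password}",
    # Oracle thin-driver URLs place the credentials before the host and use an SID.
    "oracle":    "jdbc:oracle:thin:{user}/{password}@{host}:{port}:{sid}",
}

def jdbc_url(vendor: str, **fields: str) -> str:
    """Return the filled-in JDBC URL for one of the vendors above."""
    return JDBC_TEMPLATES[vendor].format(**fields)

url = jdbc_url("redshift", host="example.com", port="5439",
               database="analytics", user="reporter", password="s3cret")
# url is the Redshift template with the example values substituted in.
```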


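A common pitfall with the MongoDB connection string above: reserved URI characters in the username or password (such as @, :, or /) must be percent-encoded, or the mongodb:// URI will not parse correctly. A minimal sketch of how to do that in Python, with illustrative values only:

```python
from urllib.parse import quote_plus

# Illustrative values only; substitute your own credentials, host, database,
# and collection from the MongoDB row of the table.
user = "reader"
password = "p@ss:word/1"          # contains characters reserved in URIs
host, port = "db.example.com", 27017
database, collection = "marketing", "spend"

# Percent-encode the user and password so reserved characters do not break
# the URI, then assemble the mongodb:// string in the table's format.
uri = (f"mongodb://{quote_plus(user)}:{quote_plus(password)}"
       f"@{host}:{port}/{database}.{collection}")
# The @, :, and / in the password become %40, %3A, and %2F in the URI.
```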