You can manually start a dataflow job to load the data into datasets immediately. You can also stop the job while it’s running. You can run a maximum of 24
dataflow jobs during a rolling 24-hour period.
|Available in: Salesforce Classic and Lightning Experience|
|Available in: Developer Edition|
|Available for an extra cost in: Enterprise, Performance, and Unlimited Editions|
|User Permissions Needed|
|To start a dataflow job:|“Edit Wave Analytics Dataflows”|
By default, the dataflow doesn’t run automatically. To start running the dataflow on the schedule, you must manually start the dataflow first. After the first job runs, the dataflow job runs on the daily schedule.
1. In Wave Analytics, click the gear icon and then click Data Monitor to open the data monitor.
The Jobs view of the data monitor appears by default.
2. Select Dataflow View.
3. Click Start in the actions list to start the dataflow job.
The dataflow job is added to the job queue. The Start button is greyed out while the dataflow job runs.
After the job completes, Wave Analytics sends an email notification to the user who last modified the dataflow definition file.
The email notification indicates whether the job completed successfully. It also shows job details like start time, end time, duration, and number of processed rows. If the job failed, the notification shows the reason for the failure.
You can monitor the dataflow job in the data monitor to determine when the dataflow completes. After the dataflow completes successfully, refresh the Home page to view the registered datasets.
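Besides the data monitor UI, a dataflow job can also be started programmatically through the Analytics REST API. The sketch below builds the request for that call; the endpoint path, API version, instance URL, and dataflow ID shown are illustrative assumptions, so verify them against your org's API documentation before use.

```python
import json

# Sketch: starting a dataflow job via the Analytics REST API instead of
# clicking Start in the data monitor. The API version and IDs here are
# hypothetical placeholders, not values from this article.
def build_start_dataflow_request(instance_url, api_version, dataflow_id):
    """Return the URL and JSON body for a POST that starts a dataflow job."""
    url = f"{instance_url}/services/data/v{api_version}/wave/dataflowjobs"
    body = {"dataflowId": dataflow_id, "command": "start"}
    return url, json.dumps(body)

# Example usage (all three arguments are hypothetical):
url, body = build_start_dataflow_request(
    "https://example.my.salesforce.com",  # hypothetical instance URL
    "36.0",                               # hypothetical API version
    "02KB00000000XyzMAE",                 # hypothetical dataflow ID
)
```

The returned URL and body would be sent as an authenticated POST (for example with an OAuth bearer token); the response includes the queued job's ID, which you could then poll, much as the data monitor does.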