A job flow defines the order in which a set of jobs is executed. The jobs can be left unconnected, in which case they are all triggered at once, or connected to each other using links, where each link carries a condition that determines whether the subsequent job is triggered. Execution starts from the jobs that have no incoming links; if there are multiple such jobs, they are started in parallel. Any subsequent jobs linked to them are triggered once the condition defined on the connecting link is satisfied. The following pages describe how to configure jobs and links.
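
The sketch below illustrates this trigger order under stated assumptions: the Job and Link classes, the ${} result values, and the queue-based scheduling are illustrative only and are not the product's API. Jobs with no incoming links are picked up first, and a downstream job runs only when its link condition is satisfied by the upstream result.

```python
# Minimal sketch of condition-driven job flow execution (illustrative only).
from collections import deque
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass(eq=False)
class Job:
    name: str
    action: Callable[[], str]                           # the work the job performs
    links: List["Link"] = field(default_factory=list)   # outgoing links

@dataclass(eq=False)
class Link:
    target: Job
    condition: Callable[[str], bool]   # evaluated against the upstream job's result

def run_flow(jobs: List[Job]) -> None:
    # Entry points are the jobs with no incoming links; the product starts
    # them in parallel, while this sketch simply queues them one by one.
    targets = {link.target for job in jobs for link in job.links}
    ready = deque(job for job in jobs if job not in targets)
    while ready:
        job = ready.popleft()
        result = job.action()
        print(f"{job.name} finished with result: {result}")
        # Trigger only the downstream jobs whose link condition is satisfied.
        for link in job.links:
            if link.condition(result):
                ready.append(link.target)

# Example: job_b is triggered only if job_a's result satisfies the link condition.
job_b = Job("job_b", lambda: "done")
job_a = Job("job_a", lambda: "success", [Link(job_b, lambda r: r == "success")])
run_flow([job_a, job_b])
```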

Working with link in Job Flow

Working with Data Flow Instance Job

Working with DB Command Job

Working with Task Command Job

Working with File Watcher Job

Working with DFS Command Job

Working with Email Notification Job

Working with File Transfer Job

Working with Web Services Job

Working with Loop Job

Working with MongoDB Job

Working with Google Publisher Job

Working with Office 365 Job

Working with Kafka Producer Job

A job flow also provides the option to create parameters whose values can be used across jobs and link conditions at runtime.
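
As a rough illustration of how such parameters might be resolved at runtime, the sketch below assumes a hypothetical ${name} placeholder syntax; the product's actual parameter syntax and resolution mechanism may differ.

```python
# Sketch of substituting job flow parameters into job settings and link
# conditions at runtime (placeholder syntax is an assumption, not the product's).
from string import Template

# Parameters defined at the job flow level (illustrative values).
flow_params = {"input_path": "/data/incoming", "threshold": "100"}

# A job setting and a link condition that reference those parameters.
job_setting = Template("load files from ${input_path}")
link_condition = Template("record_count > ${threshold}")

print(job_setting.substitute(flow_params))     # load files from /data/incoming
print(link_condition.substitute(flow_params))  # record_count > 100
```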

Working with Job Flow parameter file

Working with Job Flow parameters

Working with Job Flow SQL Parameters

Working with runtime status and statistics