A Data Flow is an ELT mapping that represents the movement of data between one or more sources and targets. The source and target definitions can be linked directly to each other, or connected through intermediate transformation objects that define how the data changes based on business rules.
A data flow must have at least one source object and one target object, called the source instance transform and the target instance transform. These transforms can be configured to define how data is read from the source and how it is loaded into the target.
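The structure described above can be sketched in code. The following Python snippet is an illustrative model only (the class and field names are assumptions, not part of the Diyotta product): it captures the rule that a data flow is a set of connected nodes that must contain at least one source instance and one target instance.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """A node in the data flow: a source instance, a target instance,
    or an intermediate transform (names are hypothetical)."""
    name: str
    kind: str  # "source", "target", or "transform"

@dataclass
class DataFlow:
    """Illustrative sketch of a data flow, not the actual Diyotta API."""
    name: str
    nodes: List[Node] = field(default_factory=list)

    def is_valid(self) -> bool:
        # Enforce the rule from the text: at least one source
        # instance and one target instance are mandatory.
        kinds = {n.kind for n in self.nodes}
        return "source" in kinds and "target" in kinds

flow = DataFlow("orders_load", [
    Node("src_orders", "source"),
    Node("filter_active", "transform"),
    Node("tgt_orders", "target"),
])
print(flow.is_valid())  # True: has both a source and a target instance
```

A flow missing either endpoint would fail this check, which mirrors the mandatory source/target constraint stated above.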
Diyotta provides a standard set of transformations to apply business rules to the source data represented by the source instance transform. Each transformation has a specific functionality and can be configured to perform the required logic. These transformations are listed below.
For more information on creating a data flow, refer to Creating New Data Flow.
For more information on data flow transformations, refer to Working with Data Flow Transforms.
When creating a data flow, you must select a native type for processing the data. The native type can be either generic or a processing database.
Generic: A generic native type data flow only moves data directly from the source system to the target system, with no transformations in between. You can add only the source instance transform and the target instance transform to this type of data flow.
Processing database: If a processing database is selected when creating the data flow, you can add transforms between the source and target instances to apply business logic to the data from the source system before loading it into the target system. A processing database can also be used when data only needs to be moved from source to target without any transformation.
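The difference between the two native types can be summarized as a simple rule about which node kinds a flow may contain. This is a hypothetical sketch (the function and kind names are assumptions for illustration, not Diyotta identifiers):

```python
def allowed_node_kinds(native_type: str) -> set:
    """Illustrative rule for which node kinds a data flow may contain,
    based on the native type chosen at creation time (assumed names)."""
    if native_type == "generic":
        # Generic flows: direct source-to-target movement only,
        # no intermediate transforms allowed.
        return {"source", "target"}
    # A processing database additionally permits intermediate transforms.
    # It can still be used for straight source-to-target moves.
    return {"source", "transform", "target"}

print(allowed_node_kinds("generic"))
print(allowed_node_kinds("processing_database"))
```

Note that the generic case returns a strict subset of the processing-database case, reflecting that a processing database supports everything a generic flow does, plus transforms.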
A data flow also provides the option to create parameters, which can be used to assign values dynamically at runtime. For more details, refer to the pages below.
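The idea behind runtime parameters can be illustrated with a small substitution example. This sketch uses Python's standard `string.Template` to stand in for the product's parameter mechanism; the placeholder name and query are invented for illustration:

```python
import string

# A data flow property holds a placeholder that is resolved with
# values supplied at runtime (hypothetical parameter and query).
template = string.Template(
    "SELECT * FROM orders WHERE load_date = '$LOAD_DATE'"
)

# Values assigned dynamically when the flow runs.
runtime_params = {"LOAD_DATE": "2024-01-15"}

query = template.substitute(runtime_params)
print(query)
```

Running the same flow with a different parameter value produces a different resolved query, which is the point of assigning values at runtime rather than hard-coding them.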