You can edit the data flow properties by selecting the Properties tab. These properties are applied to all the transforms in the Data Flow.

Note: To view the Properties tab, either click on the empty area of the Data Flow canvas, or click on the Data Flow name in the search field drop-down.

Manage the following properties of the Data Flow from the Properties tab.

Logging Level: You can choose the level at which logs are generated during the execution of the data flow. These logs can be viewed in the Diyotta Monitor.

The logging options are:

  • ERROR - Designates error events.
  • WARN - Designates potentially harmful situations.
  • INFO - Designates informational messages that highlight the progress of the application at a coarse-grained level.
  • DEBUG - Designates fine-grained informational events that are most useful to debug an application.
  • TRACE - Designates finer-grained informational events than DEBUG.

The log details of all preceding levels are also included in the succeeding level's logs. For example, DEBUG-level logs include INFO, WARN, and ERROR details, and TRACE-level logs include DEBUG, INFO, WARN, and ERROR details.
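This cumulative behavior can be illustrated with a minimal sketch using Python's standard logging module (generic logging, not Diyotta-specific): a level acts as a severity threshold, so a lower threshold such as DEBUG admits every higher-severity message as well.

```python
import logging

# A minimal sketch using generic Python logging (not Diyotta-specific) to
# illustrate how a level acts as a threshold that admits all higher severities.
class ListHandler(logging.Handler):
    """Collects the level names of emitted records for inspection."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def emit(self, record):
        self.levels.append(record.levelname)

logger = logging.getLogger("dataflow_demo")
logger.propagate = False
handler = ListHandler()
logger.addHandler(handler)

logger.setLevel(logging.DEBUG)   # DEBUG threshold also admits INFO, WARNING, ERROR
logger.debug("fine-grained event")
logger.info("progress message")
logger.warning("potentially harmful situation")
logger.error("error event")
print(handler.levels)  # ['DEBUG', 'INFO', 'WARNING', 'ERROR']

handler.levels.clear()
logger.setLevel(logging.ERROR)   # ERROR threshold drops everything less severe
logger.debug("suppressed")
logger.info("suppressed")
logger.error("error event")
print(handler.levels)  # ['ERROR']
```

The same ordering applies to the levels listed above: choosing TRACE in the Data Flow properties yields the most detail, while ERROR yields the least.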

TForm Cleanup

Set this option to specify whether the temporary tables created during the execution of the data flow should be dropped after it completes successfully.

You can choose to delete or retain these temporary tables:

  • Yes - Delete (clean up) the temporary tables.
  • No - Retain the temporary tables.

Setting TForm Cleanup to "No" is useful in development scenarios, when the intermediate results of a data flow with multiple transformations need to be analysed to validate the business logic.

JDBC Transaction Control

This option is applicable when loading data into the target system using the JDBC load type. If the load fails before inserting all the records, the partially inserted records are rolled back from the target table. If there are multiple pipelines in the data flow and any pipeline fails at any point, the inserted records in all the target tables are rolled back. The rollback applies only to targets loaded using the JDBC load type; it is ignored for targets using a bulk load type.

You can set the JDBC transaction control to Yes or No. 

  • Yes - To rollback the data in case of failure.
  • No - To commit the partial data in case of failure. This is the default setting.
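The rollback behavior described above can be sketched with a generic database transaction; here Python's sqlite3 module stands in for a JDBC connection, and the table and data are illustrative only, not Diyotta's actual load code:

```python
import sqlite3

# A generic sketch of transaction-controlled loading (sqlite3 stands in for a
# JDBC connection). With transaction control "Yes", a mid-load failure rolls
# back every row inserted so far, so no partial data remains in the target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT NOT NULL)")
conn.commit()

rows = [(1, "a"), (2, "b"), (3, None)]  # third row violates NOT NULL mid-load

try:
    cur = conn.cursor()
    for row in rows:
        cur.execute("INSERT INTO target VALUES (?, ?)", row)
    conn.commit()        # commit only after the full load succeeds
except sqlite3.IntegrityError:
    conn.rollback()      # failure: the two partially inserted rows are removed

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 0 -- the target table is left empty
```

With the "No" setting, the equivalent behavior would be committing after each insert, leaving the first two rows in the target table when the load fails.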