Diyotta REST Specifications

Diyotta provides a RESTful API service to perform operations in Diyotta without logging into the web user interface.

All REST connectivity to Diyotta is through the URL that is used to connect to the web user interface. All REST calls to Diyotta must be submitted with a valid user authorization key. Diyotta internally uses this key to determine the user against which the action should be logged and whether the user has the required privileges to perform the action. The authorization key is generated using Diyotta's dicmd command by following the steps below.

Step I: Decide the Diyotta user ID that will be used to perform the action.

Step II: Run the dicmd command with passwd option.

dicmd passwd -e '<UserName>:<UserPassword>' (Example: 'admin:admin')

This will generate the authorization key (Example: M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE=), which needs to be provided as a header in the REST call.

The user used to generate the authorization key must have sufficient privileges to perform the operation specified in the REST call.
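
For reference, below is a minimal shell sketch that generates the authorization key and reuses it in a REST call. The user name, password, and URL are placeholders, and it assumes that dicmd prints only the generated key to standard output.

$ AUTH_KEY=$(dicmd passwd -e 'admin:admin')
$ curl -X GET "https://dev.diyotta.com/diyotta-rest/version" -H "Authorization:${AUTH_KEY}"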

Below are the various options available in the REST API to perform different actions in Diyotta.

  • version: This option is used to view the current version of the Diyotta Controller.

Method URL

<DiyottaURL>/diyotta-rest/version

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

GET

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

None

Privileges

The user requires "Studio Read" privilege for performing this action.

Example: Use Linux curl command to execute Diyotta Rest API call to view the Diyotta Controller version.

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/version" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","version":"4.1.0.3114.004"}

  • serverstatus: This option is used to view the status of the Diyotta Controller at any moment. If the Controller is up and running without any issues, then the response message will state "Diyotta Controller is Active". In case of issues, the call exits with an appropriate error message.

Method URL

<DiyottaURL>/diyotta-rest/serverstatus

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

GET

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

None

Privileges

Not restricted by user specific privileges.

Example: Use Linux curl command to execute Diyotta Rest API call to check the status of the server.

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/serverstatus" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"Diyotta Controller is Active "}

  • agentstatus: This option is used to view the status of a specific agent registered with the Controller. Alternately, this option can provide status of all agents registered with the Controller. 

Method URL

<DiyottaURL>/diyotta-rest/agentstatus

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

GET

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

agent (optional)

Name of the agent to check the status. Only one agent name can be specified with this parameter. If this parameter is not provided then, this option prints the status of all the agents registered with the Controller.

Privileges

Not restricted by user specific privileges.

Example: Use Linux curl command to execute Diyotta Rest API call to check the status of all the agents registered with the Controller.

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/agentstatus" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

Agent Details :
Default : Active
src_agnt_1 : Not Active
tgt_agnt_1 : Not Active
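
Example (illustrative): Use Linux curl command to check the status of a single agent by passing the agent parameter described above. The agent name Default is taken from the output shown; per the parameter description, the response is expected to list only that agent.

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/agentstatus?agent=Default" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="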

  • jobstatus: This option is used to view the status of the last execution of data flows and job flows in a particular project or layer. You can also restrict the output to a specific status type. The output corresponds to the status that can be seen in the Monitor module.

Method URL

<DiyottaURL>/diyotta-rest/jobstatus

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

GET

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

status (optional) 

Restrict the output based on the execution status. Only one of these values can be specified with this parameter. If this parameter is not provided then, the command will consider all the statuses.

active - Jobs with status as active
failed - Jobs with status as failed
success - Jobs with status as succeeded
aborted - Jobs with status as aborted
stopped - Jobs with status as stopped

project (optional)

Restrict the output to a specific project. Only one project can be specified with this parameter. If this parameter is not provided then, the command will consider all the projects to which the user has access.

layer (optional)

Restrict the output to a specific layer in a project. Only one layer can be specified with this parameter. If this parameter is provided then, it is mandatory to provide the project name also. If this parameter is not passed and project name is specified as parameter then, the command will consider all the layers in the projects to which the user has access.

dataflow (optional)

Restrict the output to a specific data flow in a layer in a project. Only one data flow can be specified with this parameter. If this parameter is provided then, it is mandatory to provide the project name and the layer name to which it belongs. You can either provide data flow name parameter or job flow name parameter for this option. If this parameter and job flow name parameter is not provided and project name and/or layer name is specified as parameter then, the command will consider all the data flows in the project/layer specified.

jobflow (optional) 

Restrict the output to a specific job flow in a layer in a project. Only one job flow can be specified with this parameter. If this parameter is provided then, it is mandatory to provide the project name and the layer name to which it belongs. You can either provide data flow name parameter or job flow name parameter for this option. If this parameter and data flow name parameter is not provided and project name and/or layer name is specified as parameter then, the command will consider all the job flows in the project/layer specified.

runid (optional)

Restrict output to a specific Diyotta generated internal run id associated with the execution. This parameter can be used with any other parameter for this option. Only one runid can be specified with this parameter.

Privileges

The user requires "Monitor Read" privilege for performing this action.

Example: Use Linux curl command to execute Diyotta Rest API call to print all the failed job flows and data flows executed in a layer.

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/jobstatus?status=failed&project=Project_1&layer=Layer_1" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":0,"jobflow":[],"dataflow":[{"project":"Project_1","layer":"Layer_1","flow":"d_Oracle_to_Hive2","status":"failed","starttime":"2019-05-23 10:33:25","endtime":"2019-05-23 10:33:48","elapsedtime":23000,"runid":13052},{"project":"Project_1","layer":"Layer_1","flow":"df_Oracle_to_Hadoop","status":"failed","starttime":"2019-05-23 10:33:41","endtime":"2019-05-23 10:41:42","elapsedtime":481000,"runid":13053},{"project":"Project_1","layer":"Layer_1","flow":"df_hive_to_hive","status":"failed","starttime":"2019-05-24 04:59:19","endtime":"2019-05-24 05:01:45","elapsedtime":146000,"runid":13054}]}
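
Example (illustrative): Use Linux curl command to check a single execution by passing the runid parameter described above. The run id 13054 is taken from the output shown; the response follows the same format as above.

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/jobstatus?runid=13054" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="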

  • compile: This option is used to recompile the data flows or job flows in a project or layer. 

Method URL

<DiyottaURL>/diyotta-rest/compile

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

project

Compile all the data flows and job flows in a specific project. Only one project can be specified with this parameter. It is mandatory to provide this parameter.

layer (optional)

Compile all the data flows and job flows in a specific layer in a project. Only one layer can be specified with this parameter. If this parameter is provided then, it is mandatory to provide the project name also. If this parameter is not passed and project name is specified as parameter then, the command will consider all the layers in the projects to which the user has access.

flow (optional)

Compile a specific data flow or job flow in a layer in a project. Only one data flow or job flow can be specified with this parameter. If this parameter is provided then, it is mandatory to provide the project name and the layer name to which it belongs. You also need to provide the flowtype parameter when this parameter is provided. If this parameter is not provided and project name and/or layer name is specified as parameter then, the command will consider all the data flows and jobflows in the project/layer specified.

flowtype (optional)

Specify whether the flow name specified in the flow parameter is a dataflow or a jobflow. Allowed values for this parameter are dataflow and jobflow. It is mandatory to provide this parameter with the flow parameter.

Privileges

The user requires "Studio Read" and "Studio Write" privilege for performing this action.

Example: Use Linux curl command to execute Diyotta Rest API call to compile a specific dataflow in a project.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/compile?project=Project_1&layer=Layer_1&flow=df_hive_to_hive&flowtype=dataflow" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":0,"dataflow":[{"project":"Project_1","layer":"Layer_1","flow":"df_hive_to_hive","status":"compilation completed successfully","timestamp":"2019-06-27 10:30:00"}]}
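
Example (illustrative): Use Linux curl command to compile all the data flows and job flows in a layer by omitting the flow and flowtype parameters, as described above. The response is expected to follow the same pattern as the example output shown.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/compile?project=Project_1&layer=Layer_1" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="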

  • execute: This option is used to execute dataflows and jobflows.

Method URL

<DiyottaURL>/diyotta-rest/execute

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Form Data

paramFile

Specify the parameter file name with fully qualified path to be used as part of job flow execution. If no parameter file needs to be provided then the value for this can be left blank.

Parameters

project

Specify the name of the project to which the data flow or job flow which needs to be executed belongs. It is mandatory to provide this parameter. Only one project name can be specified with this parameter.

layer

Specify the name of the layer to which the data flow or job flow which needs to be executed belongs. It is mandatory to provide this parameter. Only one layer name can be specified with this parameter.

flow

Specify the name of the data flow or job flow that needs to be executed. You can either provide data flow name parameter or job flow name parameter for this option. It is mandatory to provide either data flow name or job flow name. Only one data flow or job flow name can be specified with this parameter.

flowtype

Specify whether the flow name specified in the flow parameter is a dataflow or a jobflow. Allowed values for this parameter are dataflow and jobflow. It is mandatory to provide this parameter with the flow parameter.

jobnm (optional)

Specify the name of the job within the job flow that needs to be executed. This parameter can be specified only when a job flow name is provided in the flow parameter. Only one job name can be specified with this parameter.

paramInput (optional)

Specify the parameters and the value that need to be used to override the default parameter value defined in the data flow or job flow during execution. This parameter allows you to override project parameter, data flow parameter and Job Flow parameter. 

The parameter name and value needs to be encapsulated in single quotes ('') and multiple parameters need to be separated by comma (,).

The parameter names should be prefixed with identifiers based on the type of the parameter being passed. The project parameter should be prefixed with $PP_, the data flow parameter should be prefixed with $MP_, and the job flow parameter should be prefixed with $FL_.

instanceName (optional)

Specify a name for the instance to be associated with the execution of the job flow. This parameter is used to run multiple instances of a job flow in parallel. This parameter can be specified only when a job flow name parameter is provided. If no instance name is specified, then the job flow name will appear as is in the Monitor. When an instance name is specified, the job flow name will appear with [instance name] suffixed to it.

email (optional)

Use this parameter to send email notification in case of failure of data flow or job flow being executed. Below are the associated parameters that need to be specified with this parameter.

mailTo (optional)

Specify the email id of the recipient to whom email should be sent. Multiple email ids can be specified as comma separated values. It is mandatory to provide this parameter to send the email. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter. 

cc (optional)

Specify the email id of the recipient to whom copy of the email should be sent. Multiple email ids can be specified as comma separated values. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter. 

subject (optional)

Specify the subject of the email to be sent. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter. 

message (optional)

Specify the message to be included in the email to be sent. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter.

logs (optional)

Use this option if you want monitor logs associated with this failed execution to be attached to the email sent.

Privileges

The user requires "Studio Read" and "Studio Execute" privilege to perform this action.

Example 1: Use Linux curl command to execute Diyotta Rest API call to execute a Job Flow.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/execute?project=Project_1&layer=Layer_1&flow=jf_oracle_to_hive_fllt&flowtype=jobflow&param=\$PP_PRD_FLTR_PROJ=15" -F "paramFile=" -H "Authorization:1fNOyd4pAtPTKPE2e6N6heUnHGM3AMUjNFFeO2nzXkw="

{"code":0,"project":"Project_1","layer":"Layer_1","flow":"jf_oracle_to_hive_fllt","flowtype":"dataflow","message":"The flow has been executed","runid":161,"trackingURL":"https://dev.diyotta.com/diservice/cli/jobstatus?runid=161"}

Example 2: Use Linux curl command to execute Diyotta Rest API call to execute a Job Flow by overriding project parameters used in the job flow.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/execute?project=Project_1&layer=Layer_1&flow=jf_oracle_to_hive_fllt&flowtype=jobflow&param=\$PP_PRD_FLTR_PROJ=250" -F "paramFile=" -H "Authorization:1fNOyd4pAtPTKPE2e6N6heUnHGM3AMUjNFFeO2nzXkw="

{"code":0,"project":"Project_1","layer":"Layer_1","flow":"jf_oracle_to_hive_fllt","flowtype":"dataflow","message":"The flow has been executed","runid":161,"trackingURL":"https://dev.diyotta.com/diservice/cli/jobstatus?runid=161"}

Example 3: Use Linux curl command to execute Diyotta Rest API call to execute an instance of a job flow and send notification with email attachment in case of failure.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/execute?project=Project_1&layer=Layer_1&flow=d_Hive_to_hive_jnr_1&param=\$PP_PRD_FLTR_PROJ=14&flowtype=dataflow&email&mailTo=USER2@diyotta.com&cc=USER3@diyotta.com&subject=\$$DataFlowName&message='statusassucceded'&logs" -F "paramFile=" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":0,"project":"Project_1","layer":"Layer_1","flow":"d_Hive_to_hive_jnr_1","flowtype":"dataflow","message":"The flow has been executed","runid":13061,"trackingURL":"https://dev.diyotta.com/diservice/cli/jobstatus?runid=13061"}
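
Example 4 (illustrative): Use Linux curl command to execute a Job Flow while supplying a parameter file through the paramFile form field described above. The path /home/disupport/params.txt is a hypothetical placeholder; the file must exist on the machine and follow Diyotta's parameter file format.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/execute?project=Project_1&layer=Layer_1&flow=jf_oracle_to_hive_fllt&flowtype=jobflow" -F "paramFile=/home/disupport/params.txt" -H "Authorization:1fNOyd4pAtPTKPE2e6N6heUnHGM3AMUjNFFeO2nzXkw="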

  • restartfromfailure: This option is used to start executing a failed job flow or data flow from the point of failure of the prior run.

Method URL

<DiyottaURL>/diyotta-rest/restartfromfailure

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Form Data

paramFile

Specify the parameter file name with fully qualified path to be used as part of job flow execution. If no parameter file needs to be provided then the value for this can be left blank. 

Parameters

project

Specify the name of the project to which the data flow or job flow which needs to be restarted belongs. It is mandatory to provide this parameter. Only one project name can be specified with this parameter.

layer

Specify the name of the layer to which the data flow or job flow which needs to be restarted belongs. It is mandatory to provide this parameter. Only one layer name can be specified with this parameter.

flow

Specify the name of the data flow or job flow that needs to be restarted. You can either provide data flow name parameter or job flow name parameter for this option. It is mandatory to provide data flow name parameter or job flow name parameter. Only one data flow or job flow name can be specified with this parameter.

flowtype

Specify whether the flow name specified in the flow parameter is a dataflow or a jobflow. Allowed values for this parameter are dataflow and jobflow. It is mandatory to provide this parameter with the flow parameter.

Privileges

The user requires "Studio Read" and "Studio Execute" privilege to perform this action.

Example: Use Linux curl command to execute Diyotta Rest API call to restart a Job Flow from failed point.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/restartfromfailure?project=Project_1&layer=Layer_1&flow=jf_oracle_to_hive_fllt&flowtype=jobflow" -F "paramFile=" -H "Authorization:1fNOyd4pAtPTKPE2e6N6heUnHGM3AMUjNFFeO2nzXkw="

{"code":0,"project":"Project_1","layer":"Layer_1","flow":"jf_oracle_to_hive_fllt","flowtype":"dataflow","message":"The flow has been executed","runid":161,"trackingURL":"https://dev.diyotta.com/diservice/cli/jobstatus?runid=161"}

  • rerun: This option is used to execute a job flow or data flow with the same Diyotta generated internal run id as the prior run.

Method URL

<DiyottaURL>/diyotta-rest/rerun

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Form Data

paramFile

Specify the parameter file name with fully qualified path to be used as part of job flow execution. If no parameter file needs to be provided then the value for this can be left blank.

Parameters

project

Specify the name of the project to which the data flow or job flow which needs to be rerun belongs. It is mandatory to provide this parameter. Only one project name can be specified with this parameter.

layer

Specify the name of the layer to which the data flow or job flow which needs to be rerun belongs. It is mandatory to provide this parameter. Only one layer name can be specified with this parameter.

flow

Specify the name of the data flow or job flow that needs to be rerun. You can either provide data flow name parameter or job flow name parameter for this option. It is mandatory to provide data flow name parameter or job flow name parameter. Only one data flow or job flow name can be specified with this parameter.

flowtype

Specify whether the flow name specified in the flow parameter is a dataflow or a jobflow. Allowed values for this parameter are dataflow and jobflow. It is mandatory to provide this parameter with the flow parameter.

paramInput (optional)

Specify the parameters and the value that need to be used to override the default parameter value defined in the data flow or job flow during execution. This parameter allows you to override project parameter, data flow parameter and Job Flow parameter. 

The parameter name and value needs to be encapsulated in single quotes ('') and multiple parameters need to be separated by comma (,).

The parameter names should be prefixed with identifiers based on the type of the parameter being passed. The project parameter should be prefixed with $PP_, the data flow parameter should be prefixed with $MP_, and the job flow parameter should be prefixed with $FL_.

instanceName (optional)

Specify a name for the instance to be associated with the execution of the job flow. This parameter is used to run multiple instances of a job flow in parallel. This parameter can be specified only when a job flow name parameter is provided. If no instance name is specified, then the job flow name will appear as is in the Monitor. When an instance name is specified, the job flow name will appear with [instance name] suffixed to it.

email (optional)

Use this parameter to send email notification in case of failure of data flow or job flow being executed. Below are the associated parameters that need to be specified with this parameter.

mailTo (optional)

Specify the email id of the recipient to whom email should be sent. Multiple email ids can be specified as comma separated values. It is mandatory to provide this parameter to send the email. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter. 

cc (optional)

Specify the email id of the recipient to whom copy of the email should be sent. Multiple email ids can be specified as comma separated values. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter. 

subject (optional)

Specify the subject of the email to be sent. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter. 

message (optional)

Specify the message to be included in the email to be sent. You can use project parameters, job flow parameters and system parameters to specify the value to be passed with this parameter.

logs (optional)

Use this option if you want monitor logs associated with this failed execution to be attached to the email sent.

Privileges

The user requires "Studio Read" and "Studio Execute" privilege to perform this action.

Example: Use Linux curl command to execute Diyotta Rest API call to rerun a Job Flow.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/rerun?project=Project_1&layer=Layer_1&flow=jf_oracle_to_hive_fllt&flowtype=jobflow" -F "paramFile=" -H "Authorization:1fNOyd4pAtPTKPE2e6N6heUnHGM3AMUjNFFeO2nzXkw="

{"code":0,"project":"Project_1","layer":"Layer_1","flow":"jf_oracle_to_hive_fllt","flowtype":"dataflow","message":"The flow has been executed","runid":161,"trackingURL":"https://dev.diyotta.com/diservice/cli/jobstatus?runid=1611"}

  • abort: This option is used to abort an active job flow or data flow execution.

Method URL

<DiyottaURL>/diyotta-rest/abort

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

project

Specify the name of the project to which the data flow or job flow which needs to be aborted belongs. It is mandatory to provide this parameter. Only one project name can be specified with this parameter.

layer

Specify the name of the layer to which the data flow or job flow which needs to be aborted belongs. It is mandatory to provide this parameter. Only one layer name can be specified with this parameter.

flow

Specify the name of the data flow or job flow that needs to be aborted. You can either provide data flow name parameter or job flow name parameter for this option. It is mandatory to provide data flow name parameter or job flow name parameter. Only one data flow or job flow name can be specified with this parameter.

flowtype

Specify whether the flow name specified in the flow parameter is a dataflow or a jobflow. Allowed values for this parameter are dataflow and jobflow. It is mandatory to provide this parameter with the flow parameter.

Privileges

The user requires "Studio Read" and "Studio Execute" privilege to perform this action.

Example: Use Linux curl command to execute Diyotta Rest API call to abort a Job Flow.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/abort?project=Project_1&layer=Layer_1&flow=jf_oracle_to_hive_fllt&flowtype=jobflow" -H "Authorization:1fNOyd4pAtPTKPE2e6N6heUnHGM3AMUjNFFeO2nzXkw="

{"code":0,"project":"Project_1","layer":"Layer_1","flow":"jf_oracle_to_hive_fllt","flowtype":"","message":"The flow has been aborted","runid":2515,"starttime":"2019-07-05 09:21:10","endtime":"","elapsedtime":0}

  • stop: This option is used to stop an active job flow or data flow after the currently executing jobs complete.

Method URL

<DiyottaURL>/diyotta-rest/stop

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

project

Specify the name of the project to which the data flow or job flow which needs to be stopped belongs. It is mandatory to provide this parameter. Only one project name can be specified with this parameter.

layer

Specify the name of the layer to which the data flow or job flow which needs to be stopped belongs. It is mandatory to provide this parameter. Only one layer name can be specified with this parameter.

flow

Specify the name of the data flow or job flow that needs to be stopped. You can either provide data flow name parameter or job flow name parameter for this option. It is mandatory to provide data flow name parameter or job flow name parameter. Only one data flow or job flow name can be specified with this parameter.

flowtype

Specify whether the flow name specified in the flow parameter is a dataflow or a jobflow. Allowed values for this parameter are dataflow and jobflow. It is mandatory to provide this parameter with the flow parameter.

Privileges

The user requires "Studio Read" and "Studio Execute" privilege to perform this action.

Example: Use Linux curl command to execute Diyotta Rest API call to stop a Job Flow.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/stop?project=Project_1&layer=Layer_1&flow=jf_oracle_to_hive_fllt&flowtype=jobflow" -H "Authorization:1fNOyd4pAtPTKPE2e6N6heUnHGM3AMUjNFFeO2nzXkw="

{"code":0,"project":"Project_1","layer":"Layer_1","flow":"jf_oracle_to_hive_fllt","flowtype":"","message":"The flow has been aborted","runid":2515,"starttime":"2019-07-05 09:21:10","endtime":"","elapsedtime":0}

  • export: This option is used to export Diyotta objects to a JSON specification file. You can export an entire project, a layer, or individual objects. Whenever an object is exported, all the lower-level objects that are used in it are also included in the exported file. This means that if a job flow is exported, then all of its jobs and data flows and their associated objects, such as data objects, data points, and reusable expressions, will be included in the exported file.

Method URL

<DiyottaURL>/diyotta-rest/export

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

GET

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Output file (optional)

Specify the name with which the exported JSON file should be saved. You can specify the path along with the file name where the exported file should be placed. Make sure the user with which the export option is being run has permission to write in the path specified. If a path is not specified, then the file will be saved in the folder from which the REST call is being executed. If this parameter is not specified, then the exported JSON file will be printed on the console.

Parameters

project

Specify the name of the project which needs to be referred when exporting. It is mandatory to provide this parameter when exporting any object. Only one project name can be specified with this parameter. If only project name parameter is provided then the entire project and the objects within it will be exported. 

layer (optional)

Specify the name of the layer which needs to be referred when exporting. If this parameter is provided then, it is mandatory to provide the project name also. Only one layer name can be specified with this parameter. If only layer name and project name is provided then the entire layer and the objects within it will be exported. It is mandatory to provide this parameter when exporting data flow or job flow. 

objecttype (optional)

Specify the type of object to be exported. This parameter is mandatory to be provided when exporting a specific object type. You can specify one of these types for this parameter.

dobj - To export data object
datapoint - To export data point
seq - To export sequence
expression - To export reusable expression
udf - To export user defined functions
datasubflow - To export data subflow
dataflow - To export data flow
jobflow - To export job flow
schtask - To export scheduler task
schcal - To export scheduler calendar
schmail - To export scheduler email event
schfile - To export scheduler file watcher event
scheduler - To export all tasks in scheduler
projectparams - To export Project Parameters

dbtype (optional)

Specify the database type for the database object to be exported. This parameter is mandatory to be provided when exporting group level objects - data point, data object, sequence, expression and udf. You can specify one of these data types for this parameter.

NZ - Netezza
TD - Teradata
OR - Oracle
FF - Flatfile
PG - PostgreSQL
DB - DB2
BI - BigInsights
CO - Cobol
SF - Salesforce
HD - HDFS
JS - JSON
HB - Hadoop
MS - MSSQL
SP - Splice Machine
SS - SAS
HV - Hive
XD - XSD
SY - Sybase
TW - Twitter
FB - Facebook
SK - Spark
JM - JMS
KK - Kafka
CS - Cassandra
TS - ThoughtSpot
SN - Snowflake
MY - MySQL
BQ - BigQuery
RT - RESTful
AV - Avro

conn (optional)

Specify the data point associated with the database object to be exported. This parameter is mandatory to be provided when exporting group level objects - data point, data object, sequence, expression and udf.

group (optional)

Specify the name of the group to which the database object to be exported belongs. This parameter is mandatory to be provided when exporting group level objects - data point, data object, sequence, expression and udf.

object (optional)

Specify the name of the object which needs to be exported. When exporting group level objects - data point, data object, sequence, expression and udf, it is mandatory to provide project name, object type, database type, data point name, and group name. When exporting layer level objects - data flow and job flow, it is mandatory to provide project name and layer name. When exporting scheduler objects it is mandatory to provide project name.

Privileges

The user requires "Studio Read" privilege to export studio objects and "Scheduler Read" privilege to export scheduler objects.

Example: Use Linux curl command to execute Diyotta Rest API call to export data flow

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/export?project=Project_1&layer=Layer_1&objecttype=dataflow&object=df_hive_to_hive" -o jobflow2.json -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 104k 0 104k 0 0 129k 0 --:--:-- --:--:-- --:--:-- 129k
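
Example (illustrative): Use Linux curl command to export an entire project by providing only the project parameter, as described above, and save the output with curl's -o option. The output file name project1_export.json is a placeholder.

$ curl -X GET "https://dev.diyotta.com/diyotta-rest/export?project=Project_1" -o project1_export.json -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="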

  • import: This option is used to import Diyotta Studio and Scheduler code in JSON format. The JSON file to be imported could contain code corresponding to an entire project, a layer, or individual objects.

Method URL

<DiyottaURL>/diyotta-rest/import

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Content-Type

Specify the content type of the input file. This must be set to multipart/form-data.

Form Data

file

Specify the file with the absolute path to be imported. The user running the rest command should have permission to read the file.

Parameters

project

Specify the name of the project into which the json file needs to be imported. It is mandatory to provide this parameter when importing any object. Only one project name can be specified with this parameter.

layer (optional)

Specify the name of the layer into which the json file needs to be imported. It is mandatory to provide this parameter when importing layer level objects - data flow and job flow. Only one layer name can be specified with this parameter.

option (optional)

Specify the import option for each type of object in the import JSON file. By default, data points and project parameters that already exist in Diyotta Studio are reused, and all other objects that already exist are replaced. If an object does not exist in Diyotta, it is imported. This default behavior can be overridden by specifying how each object type should be imported. Specify the import option for each object type as comma-separated key-value pairs. The import options can be replace or reuse, and the allowed object type names are as below.


all - Use this to specify the import option for all the objects in the json file.
datapoint - Use this to specify the import option for all the data points in the json file.
dataobject - Use this to specify the import option for all the data objects in the json file.
expression - Use this to specify the import option for all the expressions in the json file.
sequence - Use this to specify the import option for all the sequences in the json file.
udf - Use this to specify the import option for all the UDFs in the json file.
subflow - Use this to specify the import option for all the data subflows in the json file.
dataflow - Use this to specify the import option for all the data flows in the json file.
jobflow - Use this to specify the import option for all the job flows in the json file.
param - Use this to specify the import option for all the project parameters in the json file.
schtaskcalendar - Use this to specify the import option for all the scheduler calendars in the json file.
schtaskemail - Use this to specify the import option for all the scheduler email events in the json file.
schtaskfile - Use this to specify the import option for all the scheduler file events in the json file.
schtask - Use this to specify the import option for all the scheduler tasks in the json file.

globalproject (optional)

Specify the name of the global project into which the global objects need to be imported. It is mandatory to provide this parameter when the json file includes global objects. Only one global project name can be specified with this parameter.

globallayer (optional)

Specify the name of the layer in the global project into which the global objects need to be imported. It is mandatory to provide this parameter when the json file includes global data objects. This parameter need not be provided when importing global objects at group level - data point, data object, sequence, reusable expression and udf. Only one layer name can be specified with this parameter.

Privileges

The user requires "Studio Write" privilege to perform this action.

Example 1: Use Linux curl command to execute Diyotta Rest API call to import job flow into project Project_1 using default import options.

$ curl -X POST -H "Content-Type: multipart/form-data" -F "file=@/home/disupport/jobflow2.json" "https://dev.diyotta.com/diyotta-rest/import?project=Project_1" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"The specification file has been imported"}

Example 2: Use Linux curl command to execute Diyotta Rest API call to import job flow by specifying reuse import option for data objects. 

$ curl -X POST -H "Content-Type: multipart/form-data" -F "file=@/home/disupport/jobflow2.json" "https://dev.diyotta.com/diyotta-rest/import?project=Project_1&option="dataobject=reuse"" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"The specification file has been imported"}
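
Example 3 (illustrative): Use Linux curl command to import the same file while replacing all object types, using the all object type key described above. The file path is the same placeholder used in the earlier examples, and the response is expected to match the format shown above.

$ curl -X POST -H "Content-Type: multipart/form-data" -F "file=@/home/disupport/jobflow2.json" "https://dev.diyotta.com/diyotta-rest/import?project=Project_1&option=all=replace" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="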

  • cleanup: This option is used to perform ad hoc cleanup of Diyotta logs, temporary data files, temporary tables and operational logs. Diyotta generates logs and entries in metadata operational tables for each execution of a data flow or job flow. Similarly, temporary stage files and stage tables are not deleted for failed executions of data flows or job flows. These need to be cleaned up periodically by running the cleanup option.

Method URL

<DiyottaURL>/diyotta-rest/cleanup

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

objecttype 

Specify the type of cleanup that needs to be performed. You can specify one of these types with this parameter.

tform - To drop temporary tables from assigned tform database/schema created during execution of data flow.
local - To drop transient tables created during execution of data flow.
stage - To delete left over data files from failed jobs created when extracting data from external systems.
applogs - To delete monitor log files created to log and display the progress of data flow and job flow execution.
opsruns - To delete the entries from metadata operational tables created for data flow, job flow and scheduler task execution.
srvlogs - To delete the Diyotta generated Controller and Agent logs.

project (optional)

Specify the name of the project which needs to be referred when cleaning up. Only one project name can be specified with this parameter. If this parameter is provided then the cleanup will be performed only for those tables and files that were created as part of execution of jobs in the project specified. If this parameter is not specified then cleanup will be performed irrespective of the project where the jobs were executed.

days (optional) 

Specify the number of days prior to which the files, tables and operation table entries created should be deleted. The number of days of history that needs to be shown in run history of Monitor and Scheduler is set in Diyotta Admin. Make sure the value specified with this parameter for the cleanup types applogs and opsruns is more than that defined in Admin.

Privileges

The user requires "Studio Write" privilege to perform this action.

Example: Use Linux curl command to execute Diyotta Rest API call to clean up the temporary tform tables older than 30 days from the project Project_1.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/cleanup?project=Project_1&objecttype=tform&days=30" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"Clean up completed successfully"}
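
Example (illustrative): Use Linux curl command to delete monitor log files older than 45 days across all projects by omitting the project parameter, as described above. The value 45 is a placeholder; make sure it is larger than the run history retention configured in Diyotta Admin.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/cleanup?objecttype=applogs&days=45" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="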

  • scheduler: This option is used to pause and resume the entire scheduler or a specific task or task group in a project.

Method URL

<DiyottaURL>/diyotta-rest/scheduler

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

operation

Specify whether a pause or resume operation needs to be performed. When paused, active tasks will complete their execution, and subsequent scheduled runs will wait until the scheduler is resumed.

project (optional)

Specify the name of the project to which the task or task group specified belongs. It is mandatory to provide this parameter if task or task group name parameter is provided. Only one project name can be specified with this parameter.

taskname (optional)

Specify either task name or task group name to be paused or resumed. It is mandatory to provide the project name parameter if this parameter is provided.

Privileges

The user requires "Scheduler Execute" privilege to perform this action.

Example: Use Linux curl command to execute Diyotta Rest API call to pause a specific task in the scheduler.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/scheduler?operation=pause&project=Project_1&taskname=t_Oracle_to_Hive_Order" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"Scheduler Paused"}
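
Example (illustrative): Use Linux curl command to resume the same task after it has been paused. Omitting the project and taskname parameters would resume the entire scheduler, as described above.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/scheduler?operation=resume&project=Project_1&taskname=t_Oracle_to_Hive_Order" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="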

  • refresh: This option is used to refresh the Diyotta Lineage component. If there are any modifications to the code in Diyotta Studio then the lineage needs to be refreshed for these changes to take effect.

Method URL

<DiyottaURL>/diyotta-rest/refresh

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

None 

Privileges

The user requires "Lineage Read" and "Lineage Write" privilege for performing this action.

Example: Use Linux curl command to execute Diyotta Rest API call to refresh the lineage module and populate lineages.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/refresh" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"Metaview tables have been refreshed","timestamp":"2019-06-19 07:46:29"}

  • createlayers: This option is used to add new layers to an existing project.

Method URL

<DiyottaURL>/diyotta-rest/createlayers

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

project

Specify the name of the project to which the layers need to be added. Only one project name can be specified with this parameter.

layers

Specify the layer names that need to be created. Multiple layer names can be provided, separated by commas.

Privileges

The user requires "Administrator" privilege for performing this action.

Example: Use Linux curl command to execute Diyotta Rest API call to add layers to the project.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/createlayers?project=Project_1&layers=layer_src" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"Layers are created"}
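
Example (illustrative): Use Linux curl command to create multiple layers in one call by passing a comma-separated list, as described above. The layer names layer_stg and layer_tgt are placeholders.

$ curl -X POST "https://dev.diyotta.com/diyotta-rest/createlayers?project=Project_1&layers=layer_stg,layer_tgt" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="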

  • changeprojectparamvalue: This option is used to modify the value of a project parameter in a project. 

Method URL

<DiyottaURL>/diyotta-rest/changeprojectparamvalue

    • <DiyottaURL>: URL used to connect to Diyotta's web user interface. 

HTTP method

POST

Headers

authorization

Specify the authorization key to authenticate the session created by RestFul API call.

Parameters

project

Specify the name of the project in which the value of project parameter needs to be replaced. It is mandatory to provide this parameter. Only one project name can be specified with this parameter.

params

Specify the project parameter and the value that needs to be assigned to it.

Privileges

The user requires "Project Parameter Editor" privilege for performing this action.

Example: Use Linux curl command to execute Diyotta Rest API call to modify the project parameters in the project.


$ curl -X POST "https://dev.diyotta.com/diyotta-rest/changeprojectparamvalue?project=Project_1&params="\$PP_PRD_FLTR_PROJ=2"" -H "Authorization:M2FThB8QQAsv8fTo9KxhnSmLsjXgI18POI0qCQDUYiE="

{"code":"0","Message":"Project params are modified"}