Tutorial Videos

ELT Using Diyotta

Length: 8:10

Diyotta supports a range of ELT processing platforms: Massively Parallel Processing (MPP) data warehouses such as Teradata and Exadata, Hadoop-based platforms such as Hive, in-memory computing platforms such as Spark, and cloud-based MPP platforms such as Redshift, Snowflake, and Google BigQuery. In this demonstration, we will review key features and concepts of how Diyotta supports ELT using push-down optimization, set-based instructions, attribute variations, and extract and load mechanisms.
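
Push-down optimization means the transformation logic runs as set-based SQL inside the target platform rather than row by row in a separate engine. As a rough, hypothetical sketch (not Diyotta's actual generated code; sqlite3 stands in for the target warehouse), the idea looks like this:

    # Minimal sketch of push-down ELT: the transformation runs as one
    # set-based SQL statement inside the target platform. sqlite3 stands
    # in here for a warehouse such as Snowflake or BigQuery.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL, status TEXT)")
    cur.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")
    cur.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                    [(1, 10.0, "OK"), (2, 5.5, "FAILED"), (3, 7.0, "OK")])

    # The "transformation" is pushed down as a single INSERT ... SELECT;
    # no rows are pulled back into the integration tool.
    cur.execute("""
        INSERT INTO dw_orders (order_id, amount)
        SELECT order_id, amount
        FROM stg_orders
        WHERE status = 'OK'
    """)
    conn.commit()
    print(cur.execute("SELECT * FROM dw_orders").fetchall())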

DBMS Data Points Explained

Length: 3:14
This video will familiarize you with the fundamentals of a DBMS Data Point. We will review the concept of standard sources, such as the DBMS Data Point, and their respective properties.
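
Conceptually, a DBMS Data Point is a named collection of connection properties. The dictionary below is purely illustrative; the key names are hypothetical and not Diyotta's actual configuration schema:

    # Illustrative only: the kind of properties a DBMS Data Point holds.
    # Keys are hypothetical, not Diyotta's actual configuration schema.
    oracle_data_point = {
        "name": "ORA_SALES_DP",
        "type": "Oracle",
        "host": "oracle.example.com",
        "port": 1521,
        "service_name": "SALESDB",
        "username": "etl_user",
        "password": "********",        # normally stored in a credential vault
        "jdbc_url": "jdbc:oracle:thin:@//oracle.example.com:1521/SALESDB",
    }

    def jdbc_url(dp: dict) -> str:
        """Return the JDBC-style URL recorded on the data point."""
        return dp["jdbc_url"]

    print(jdbc_url(oracle_data_point))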

File Data Points Explained

Length: 2:16
This video will familiarize you with the fundamentals of a file-type Data Point. Diyotta connects to a multitude of data sources using native or generic connectivity, and Data Points are used to establish a relationship with the source, target, or processing platforms. In this demonstration, we will review the concept of a File Data Point, which connects to various file-type data sources such as COBOL, XML, JSON, fixed-width, and flat files.
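
For file-type sources, the Data Point must carry enough metadata to interpret the file layout. A minimal, hypothetical sketch of parsing a fixed-width record (field names and widths are made up):

    # Hypothetical sketch of what a fixed-width File Data Point must know:
    # the column layout. Field names and widths are illustrative only.
    layout = [("cust_id", 6), ("name", 20), ("balance", 10)]

    def parse_fixed_width(line: str, layout):
        record, pos = {}, 0
        for field, width in layout:
            record[field] = line[pos:pos + width].strip()
            pos += width
        return record

    # Build a sample record that matches the layout above.
    sample = "000042" + "John Smith".ljust(20) + "1234.50".rjust(10)
    print(parse_fixed_width(sample, layout))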

Hadoop Data Points Explained

Length: 6:01
This video will familiarize you with the fundamentals of a Hadoop Data Point. Diyotta supports various Hadoop integrations, such as HDFS files in Text, Avro, and Parquet formats, HBase, and Hive tables. In this demonstration, we will review the concept of Hadoop components such as Hive, HDFS, and Spark, and their respective properties for setting up Data Points. Hadoop and Spark work well together for processing and analyzing big data stored in HDFS.
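
As a hedged illustration of what such a data point enables downstream, the PySpark sketch below reads a Hive table and writes the result back to HDFS as Parquet; the table and path names are hypothetical and assume a configured Hive metastore:

    # Hypothetical PySpark sketch: read a Hive table and write the result
    # back to HDFS as Parquet. Table and path names are made up.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hadoop_data_point_demo")
             .enableHiveSupport()          # requires a configured Hive metastore
             .getOrCreate())

    orders = spark.sql("SELECT * FROM sales_db.orders WHERE status = 'OK'")
    orders.write.mode("overwrite").parquet("hdfs:///data/curated/orders")
    spark.stop()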

Cloud/API Data Points Explained

Length: 2:33
This video will familiarize you with the fundamentals of a Cloud Data Point. Diyotta connects to a multitude of cloud- and API-based sources using native API integration or via REST APIs. In this demonstration, we will review the concept of setting up Cloud Data Points to pull data from Salesforce, Facebook, Twitter, and REST APIs.
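
A hedged sketch of the underlying REST-style pull pattern (the endpoint, authentication scheme, and field names are assumptions, not any specific connector's API):

    # Hedged sketch of a REST-style pull: page through a hypothetical
    # endpoint and collect the records. URL and field names are made up.
    import requests

    def fetch_all(base_url: str, token: str):
        records, page = [], 1
        while True:
            resp = requests.get(
                f"{base_url}/contacts",
                headers={"Authorization": f"Bearer {token}"},
                params={"page": page, "per_page": 100},
                timeout=30,
            )
            resp.raise_for_status()
            batch = resp.json().get("results", [])
            if not batch:
                break
            records.extend(batch)
            page += 1
        return records

    # records = fetch_all("https://api.example.com/v1", "YOUR_TOKEN")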

Data Ingestion from Oracle to Hadoop

Length: 5:35
This video will familiarize you with the process of ingesting data from Oracle to Hadoop in Diyotta. Diyotta provides an easy-to-use platform for data ingestion and supports various standard data sources, for example, Oracle, MySQL, and Teradata.

Data Migration from MSSQL to Snowflake

Length: 5:53
This video will guide you through the fundamentals of data ingestion from Microsoft SQL Server to Snowflake using Diyotta. When moving data from on-premises systems to the cloud, one of the key challenges is how to migrate data smoothly and securely. Diyotta provides an easy-to-use platform that enables seamless and secure data migration from various sources, such as relational databases, files, or data warehouses, onto cloud instances of Snowflake, Redshift, and Google BigQuery.
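
As a hedged sketch of the load side of such a migration, the snippet below stages a file extracted from SQL Server and copies it into a Snowflake table using the snowflake-connector-python package; account, stage, and table names are hypothetical:

    # Hedged sketch: stage an extracted CSV and COPY it into Snowflake.
    # All names (account, warehouse, table) are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="********",
        warehouse="LOAD_WH", database="SALES", schema="PUBLIC",
    )
    cur = conn.cursor()
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS")   # stage the extract
    cur.execute("""
        COPY INTO ORDERS
        FROM @%ORDERS
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    conn.close()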

External Data Ingestion into BigQuery

Length: 5:48
This video will guide you through the fundamentals of data ingestion from an external database to BigQuery using Diyotta. Diyotta provides an easy-to-use platform that enables seamless and secure data migration from various sources, such as relational databases, files, or data warehouses, onto cloud instances of Snowflake, Redshift, and Google BigQuery.
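
A hedged sketch of the load step using the google-cloud-bigquery client (project, dataset, and bucket names are hypothetical):

    # Hedged sketch: load an extracted CSV from Cloud Storage into BigQuery.
    # Project, dataset, and bucket names are made up.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-analytics-project")
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(
        "gs://my-staging-bucket/exports/orders.csv",
        "my-analytics-project.sales.orders",
        job_config=job_config,
    )
    load_job.result()   # wait for the load to finish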

Data Ingestion from MSSQL to Redshift

Length: 5:38
This video will familiarize you with the fundamentals of data ingestion using Diyotta's Data Movement Wizard. This built-in wizard allows users to quickly build data flows and move data from supported sources to any target.

Simple ELT Transformations on Hadoop

Length: 6:21
This video will familiarize you with the fundamentals of data processing on Hadoop in Diyotta, using data ingested from Oracle data sources.

Simple ELT Transformations on Snowflake

Length: 5:48
This video will familiarize you with the fundamentals of data processing on Snowflake in Diyotta. In this demonstration, we will use ingested data to perform simple transformations before finally placing the processed data into a target table within Snowflake.

Simple ELT Transformations on BigQuery

Length: 6:58
This video will familiarize you with the fundamentals of data processing on Google BigQuery in Diyotta. In the Data Ingestion tutorial, we demonstrated how to ingest external data into a Google BigQuery environment. In this demonstration, we will use that ingested data to perform simple transformations and place the processed data into a target table within BigQuery.
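
A "simple transformation" of this kind is typically expressed as set-based SQL that BigQuery executes itself. A hedged sketch (dataset and table names are hypothetical):

    # Hedged sketch: a simple ELT transformation expressed as SQL that
    # BigQuery runs itself. Dataset and table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-analytics-project")
    query = """
        INSERT INTO `my-analytics-project.sales.orders_curated`
            (order_id, order_total, order_month)
        SELECT order_id,
               quantity * unit_price          AS order_total,
               DATE_TRUNC(order_date, MONTH)  AS order_month
        FROM `my-analytics-project.sales.orders`
        WHERE status = 'COMPLETE'
    """
    client.query(query).result()   # the warehouse does the work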

Advanced Features of the Data Movement Wizard

Length: 5:39
This video will guide you through the advanced features of Diyotta's Data Movement Wizard, building on the data ingestion fundamentals covered in the earlier tutorials. Diyotta provides an easy-to-use platform that enables seamless and secure data migration from various sources, such as relational databases, files, or data warehouses, onto cloud instances of Snowflake, Redshift, and Google BigQuery.

Real-Time Data Streaming

Length: 5:10
This video will familiarize you with the fundamentals of real-time data streaming on Spark 2.2.0 using Diyotta. Diyotta supports various real-time data sources such as Kafka, Spark Streaming, and JMS. In this demonstration, we will review an application that consumes weblog events from Kafka, parses the event data to identify successfully served requests versus failed requests, generates statistics on request types, and produces the results of the computation as events to a sink.
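
A hedged sketch of that streaming pattern using Spark Structured Streaming with the Kafka source (broker, topic, and sink choices are assumptions; the spark-sql-kafka package must be available):

    # Hedged sketch: consume weblog events from Kafka, classify served vs.
    # failed requests, and aggregate by request type. Broker, topic, and
    # sink are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("weblog_streaming_demo").getOrCreate()

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")
           .option("subscribe", "weblogs")
           .load())

    # Pull the HTTP method and status code out of each log line.
    logs = raw.selectExpr("CAST(value AS STRING) AS line")
    parsed = logs.select(
        F.regexp_extract("line", r'"(\w+) ', 1).alias("method"),
        F.regexp_extract("line", r'" (\d{3}) ', 1).alias("status"),
    )
    stats = (parsed
             .withColumn("outcome",
                         F.when(F.col("status").startswith("2"), "served")
                          .otherwise("failed"))
             .groupBy("method", "outcome")
             .count())

    query = (stats.writeStream
             .outputMode("complete")
             .format("console")        # a real pipeline would emit events to a sink
             .start())
    query.awaitTermination()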

Web Log Analysis on Hive

Length: 5:56

Diyotta has the ability to easily extract, transform, and load data into target platforms optimally to gain complete visibility. In this demonstration, we will review an example of unstructured server log data in Hadoop, process it through Spark to produce useful insights, and load the results into a Cassandra data object.

Web Log Analysis on Spark

Length: 6:44
This video will familiarize you with the fundamentals of processing unstructured log file data on the Spark processing platform. One of the prevailing scenarios in the modern data landscape is that a massive amount of data resides in data lakes. In this demonstration, we will review an example of unstructured server log data in Hadoop, process it through Spark to produce useful insights, and load the results into a Cassandra data object.
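
A hedged batch sketch of the same idea: parse raw server logs from HDFS with Spark and write an aggregate to Cassandra. The HDFS path, keyspace, and table are made up, and the spark-cassandra-connector is assumed to be on the classpath:

    # Hedged batch sketch: parse server logs from HDFS and write an
    # aggregate to Cassandra. Paths and names are hypothetical; requires
    # the spark-cassandra-connector on the classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("weblog_batch_demo").getOrCreate()

    lines = spark.read.text("hdfs:///data/raw/access_log")
    hits = lines.select(
        F.regexp_extract("value", r'^(\S+)', 1).alias("client_ip"),
        F.regexp_extract("value", r'" (\d{3}) ', 1).alias("status"),
    )
    per_ip = hits.groupBy("client_ip", "status").count()

    (per_ip.write
     .format("org.apache.spark.sql.cassandra")
     .options(keyspace="weblogs", table="hits_by_ip")
     .mode("append")
     .save())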

Change Data Capture in Diyotta

Length: 3:37
This video will familiarize you with the fundamentals of Change Data Capture using Data Connect in Diyotta. Diyotta provides a Data Connect wizard to configure the Change Data Capture process in order to capture live changes from source databases, such as Amazon Aurora or MySQL, to sink databases, such as Redshift or Postgres. In this demonstration, we will walk through the concepts of the Data Connect wizard.
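
Data Connect captures live changes from the source database; as a much simpler stand-in for illustration only, the sketch below propagates rows changed since a last-updated watermark, with sqlite3 standing in for the source and sink databases and all names hypothetical:

    # Greatly simplified stand-in for change data capture: propagate only
    # rows changed since the last run using a last-updated watermark.
    # (Data Connect itself captures live changes; this watermark approach
    # is just an illustration. sqlite3 stands in for source and sink.)
    import sqlite3

    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE customers (id INTEGER, name TEXT, updated_at TEXT)")
    src.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
        (1, "Ada",  "2024-01-01T10:00:00"),
        (2, "Bola", "2024-01-02T09:30:00"),
    ])

    sink = sqlite3.connect(":memory:")
    sink.execute("CREATE TABLE customers (id INTEGER, name TEXT, updated_at TEXT)")

    last_watermark = "2024-01-01T12:00:00"   # persisted between runs in practice
    changes = src.execute(
        "SELECT id, name, updated_at FROM customers WHERE updated_at > ?",
        (last_watermark,),
    ).fetchall()

    sink.executemany("INSERT INTO customers VALUES (?, ?, ?)", changes)
    sink.commit()
    print(sink.execute("SELECT * FROM customers").fetchall())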