Azure Databricks administration - ETL Workflow

Description

In this course, you will learn about Spark-based Azure Databricks. With data growing daily from every source, whether CSV, text, JSON, or any other format, consuming data has become easy through IoT systems, mobile phones, the internet, and many other devices.

Azure Databricks provides the latest versions of Apache Spark and allows you to seamlessly integrate with open source libraries. Spin up clusters and build quickly in a fully managed Apache Spark environment with the global scale and availability of Azure.

Here is the 30,000 ft overview of the course agenda: what you will learn and how you can apply it to real-world data engineering scenarios. This course is tailor-made for someone with no prior knowledge of Databricks or Azure who would like to start a career in the data world, especially around administering Azure Databricks.

Prepare for interviews and certification by solving quizzes at the end of sessions.

1. What is Databricks?

2. Databricks Components:

    a. Notebook

    b. Clusters

    c. Pool

    d. Secrets

    e. Databricks CLI

    f. Cluster Policy

3. Automate the entire administration activity via Terraform.

4. Mount Azure Blob Storage with Databricks.

5. Automate mounting Azure Blob Storage with Databricks.

6. Load CSV data into Azure Blob Storage.

7. Transform data using Scala and SQL queries.

8. Load the transformed data into Azure Blob Storage.

9. Understand Databricks tables and the filesystem.

10. Configure Azure Databricks logging via Log4j and the Spark listener library to a Log Analytics workspace.

11. Configure CI/CD using Azure DevOps.

12. Git provider integration.

13. Configure notebook deployment via Databricks Jobs.
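To give a flavour of item 3, the cluster-creation part of the administration work can be expressed with the Databricks Terraform provider. This is a minimal sketch, not the course's exact configuration: the workspace URL, cluster name, node size, and autoscale limits are placeholder assumptions you would replace with your own values.

```terraform
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

provider "databricks" {
  # Placeholder workspace URL; substitute your own Azure Databricks host.
  host = "https://adb-1234567890123456.7.azuredatabricks.net"
}

# Provision an autoscaling cluster that terminates itself when idle.
resource "databricks_cluster" "etl" {
  cluster_name            = "etl-cluster"
  spark_version           = "13.3.x-scala2.12"
  node_type_id            = "Standard_DS3_v2"
  autotermination_minutes = 30

  autoscale {
    min_workers = 1
    max_workers = 3
  }
}
```

Running `terraform apply` against this configuration creates the cluster, which is the kind of repeatable, reviewable administration the course automates.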
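Items 4 through 8 come together in a short Scala notebook cell. The sketch below assumes it runs inside a Databricks notebook (where `dbutils` and `spark` are predefined); the storage account, container, mount point, secret scope, and SQL filter are all placeholder assumptions for illustration.

```scala
// Mount the blob container, reading the account key from a secret scope
// (placeholder names: mystorageaccount, mycontainer, etl-scope).
dbutils.fs.mount(
  source = "wasbs://mycontainer@mystorageaccount.blob.core.windows.net/",
  mountPoint = "/mnt/etl-data",
  extraConfigs = Map(
    "fs.azure.account.key.mystorageaccount.blob.core.windows.net" ->
      dbutils.secrets.get(scope = "etl-scope", key = "storage-account-key")
  )
)

// Load CSV data from the mount, transform it with a SQL query,
// and write the transformed result back to blob storage.
val raw = spark.read.option("header", "true").csv("/mnt/etl-data/input/")
raw.createOrReplaceTempView("raw_input")
val transformed = spark.sql("SELECT * FROM raw_input WHERE value IS NOT NULL")
transformed.write.mode("overwrite").csv("/mnt/etl-data/output/")
```

Keeping the mount step in its own automated notebook (item 5) means the transformation notebooks can assume `/mnt/etl-data` already exists.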
