Udemy

Databricks - Master Azure Databricks for Data Engineers

Enroll Now
  • 23,129 Students
  • Updated 1/2024
4.6
(2,931 Ratings)
CTgoodjobs selects quality courses to enhance professionals' competitiveness. When you purchase a course through links on our site, we may receive an affiliate commission.

Course Information

Registration period
Year-round Enrollment
Course Level
Study Mode
Duration
17 Hour(s) 34 Minute(s)
Language
English
Taught by
Learning Journal, Prashant Kumar Pandey
Rating
4.6
(2,931 Ratings)

Course Overview

Databricks - Master Azure Databricks for Data Engineers

Learn Azure Databricks for professional data engineers using PySpark and Spark SQL with an end-to-end capstone project

About the Course

I created Databricks - Master Azure Databricks for Data Engineers using the Azure cloud platform. This course will help you learn the following:


  1. Databricks in Azure Cloud

  2. Working with DBFS and Mounting Storage

  3. Unity Catalog - Configuring and Working

  4. Unity Catalog User Provisioning and Security

  5. Working with Delta Lake and Delta Tables

  6. Manual and Automatic Schema Evolution

  7. Incremental Ingestion into Lakehouse

  8. Databricks Autoloader

  9. Delta Live Tables and DLT Pipelines

  10. Databricks Repos and Databricks Workflow

  11. Databricks REST API and CLI

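As a taste of the last topic above, here is a minimal sketch of calling the Databricks REST API using only the Python standard library. The workspace URL and token shown are hypothetical placeholders, and the `/api/2.0/clusters/list` endpoint is one common example; the course itself covers the API and CLI in depth.

```python
import urllib.request


def build_clusters_list_request(host: str, token: str) -> urllib.request.Request:
    """Build an authenticated GET request for the Databricks Clusters API.

    `host` and `token` are placeholders here; real values come from your
    Azure Databricks workspace URL and a personal access token.
    """
    url = f"{host}/api/2.0/clusters/list"
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {token}"},
        method="GET",
    )


# Hypothetical workspace URL and token, for illustration only.
req = build_clusters_list_request(
    "https://adb-1234567890123456.7.azuredatabricks.net", "dapiXXXXXXXX"
)
print(req.full_url)
# Actually sending it would be: urllib.request.urlopen(req).read()
```

In practice you would more often use the official Databricks CLI or SDK, which handle authentication profiles for you; the raw request above just shows what they do under the hood.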
Capstone Project

This course also includes an end-to-end capstone project. The project will help you understand real-life project design, coding, implementation, testing, and the CI/CD approach.

Who should take this Course?

I designed this course for data engineers who want to develop Lakehouse projects following the Medallion architecture approach on the Databricks cloud platform. It is also intended for data and solution architects responsible for designing and building the organization’s Lakehouse platform infrastructure. A third group is the managers and architects who do not work on Lakehouse implementation directly but work with those implementing it at the ground level.

Spark Version Used in the Course

This course uses Databricks in Azure Cloud and Apache Spark 3.5. I have tested all the source code and examples used in this course on Azure Databricks Cloud using Databricks Runtime 13.3.

Course Content

  • 10 section(s)
  • 91 lecture(s)
  • Section 1 Before you start
  • Section 2 Introduction
  • Section 3 Getting Started
  • Section 4 Working in Databricks Workspace
  • Section 5 Working with Databricks File System - DBFS
  • Section 6 Working with Unity Catalog
  • Section 7 Working with Delta Lake and Delta Tables
  • Section 8 Working with Databricks Incremental Ingestion Tools
  • Section 9 Working with Databricks Delta Live Tables (DLT)
  • Section 10 Databricks Project and Automation Features

What You’ll Learn

  • Databricks in Azure Cloud
  • Working with DBFS and Mounting Storage
  • Unity Catalog - Configuring and Working
  • Unity Catalog User Provisioning and Security
  • Working with Delta Lake and Delta Tables
  • Manual and Automatic Schema Evolution
  • Incremental Ingestion into Lakehouse
  • Databricks Autoloader
  • Delta Live Tables and DLT Pipelines
  • Databricks Repos and Databricks Workflow
  • Databricks REST API and CLI
  • Capstone Project

Reviews

  • K
    Kiran Vairagade
    5.0

    very nice. I like the way of teaching from scratch even for the simple things.

  • T
    Thendral R
    5.0

    The lectures provide exactly the knowledge I needed and expected.

  • M
    Michael Garrett
    4.5

    Very clear instructions - also like how the sections are ordered

  • B
    BHANU PRAKASH KARNI
    5.0

    great session on delta tables optimization and very interesting
