ETL Part 3 - Production — 1 user / 1 year

In this course, data engineers optimize and automate Extract, Transform, Load (ETL) workloads using stream processing, job recovery strategies, and automation techniques such as REST API integration. By the end of this course, you will be able to schedule highly optimized and robust ETL jobs and debug problems along the way.


Duration: 2-4 hours, 75% hands-on


The course is a series of six self-paced lessons available in both Scala and Python. A final capstone project involves refactoring a batch ETL job to a streaming pipeline. In the process, students run the workload as a job and monitor it. Each lesson includes hands-on exercises.
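The capstone involves running the refactored workload as a job and monitoring it. As a rough illustration (not course material), a job run can be triggered through the Databricks Jobs REST API 2.1 `run-now` endpoint. The sketch below only builds the request without sending it; the workspace URL, token, and job ID are placeholders:

```python
import json
import urllib.request

def build_run_now_request(host, token, job_id, notebook_params):
    """Build (but do not send) a Databricks Jobs `run-now` request.
    Endpoint and payload shape follow the Jobs REST API 2.1."""
    payload = {"job_id": job_id, "notebook_params": notebook_params}
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder workspace URL, token, and job ID for illustration only
req = build_run_now_request(
    "https://example.cloud.databricks.com",
    "dapiXXXX",
    42,
    {"run_date": "2024-01-01"},
)
```

Monitoring works the same way: polling `GET /api/2.1/jobs/runs/get` with the run ID returned by `run-now` reports the run's life-cycle state.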


Supported platforms include Azure Databricks and AWS Databricks.

Note: This course will not run on Databricks Community Edition.

  • If you're planning to use the course on Azure Databricks, select the "Azure Databricks" Platform option.
  • If you're planning to use the course on a non-Azure version of Databricks (such as AWS Databricks), select the "Other Databricks" Platform option.

Learning Objectives

During this course you will:

  • Perform an ETL job on a streaming data source
  • Parameterize a code base and manage task dependencies
  • Submit and monitor jobs using the REST API or Command Line Interface
  • Design and implement a job failure recovery strategy using the principle of idempotence
  • Optimize ETL queries using compression and caching best practices with optimal hardware choices
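One of the objectives above, a job failure recovery strategy built on idempotence, can be illustrated with a minimal Python sketch (not course material; the `load_partition` helper and completion-marker scheme are hypothetical). The key property is that re-running a failed or retried job skips work that already completed, so a retry produces the same final state:

```python
import json
import os
import tempfile

def load_partition(partition_id, state_dir):
    """Idempotent load: write a partition only if a completion
    marker does not already exist, so retries are safe."""
    marker = os.path.join(state_dir, f"{partition_id}.done")
    if os.path.exists(marker):
        return "skipped"  # already loaded by a previous run
    out_path = os.path.join(state_dir, f"{partition_id}.json")
    with open(out_path, "w") as f:
        json.dump({"partition": partition_id}, f)
    # Write the marker last, only after the data itself is durable
    with open(marker, "w") as f:
        f.write("ok")
    return "loaded"

state_dir = tempfile.mkdtemp()
first = load_partition("2024-01-01", state_dir)   # -> "loaded"
retry = load_partition("2024-01-01", state_dir)   # -> "skipped"
```

Writing the marker only after the data is persisted is what makes the retry safe: a crash before the marker exists simply causes the partition to be rewritten on the next run.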


Prerequisites

    • ETL Part 1 self-paced course (optional, but strongly encouraged)
    • ETL Part 2 self-paced course (optional, but strongly encouraged)


Course Outline

    1. Course Overview and Setup
    2. Streaming ETL
    3. Runnable Notebooks
    4. Scheduling Jobs
    5. Job Failure
    6. ETL Optimizations
    7. Capstone Project

    Lab Requirements

    License Limitations

    This self-paced training course may be used by 1 user for 12 months from the date of purchase. It may not be transferred or shared with any other user.


    The use of the self-paced training course is subject to the Terms of Service and the Databricks Privacy Policy.