Delivering data on a schedule can be a manual process. You write scripts, add complex cron tasks, and try various ways to meet an ever-changing set of requirements—and it’s even trickier to manage everything when working with teammates. Airflow can remove this headache by adding scheduling, error handling, and reporting to your workflows. In this course, you’ll master the basics of Airflow and learn how to implement complex data engineering pipelines in production. You'll also learn how to use Directed Acyclic Graphs (DAGs), automate data engineering workflows, and implement data engineering tasks in an easy and repeatable fashion—helping you to maintain your sanity.
- Create DAGs the Airflow 2.3 way, using the TaskFlow API (`@dag`/`@task` decorators) and dynamic task mapping, which was introduced in 2.3 (see the first sketch after this list)
- Run Airflow on Kubernetes with the CeleryKubernetes executor (`CeleryKubernetesExecutor`), which sends tasks to Celery workers by default and to dedicated pods on demand (see the second sketch after this list)
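A minimal sketch of the first point, assuming the "Airflow 2.3 way" means the TaskFlow API plus dynamic task mapping (new in 2.3). The DAG id and task bodies are placeholders, not taken from this repo:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_taskflow_mapping",  # hypothetical DAG id
    schedule_interval="@daily",
    start_date=datetime(2023, 1, 1),
    catchup=False,
)
def example_taskflow_mapping():
    @task
    def extract():
        # Placeholder for pulling a batch of record ids from a source system.
        return [1, 2, 3]

    @task
    def transform(record_id):
        # With .expand(), this task runs once per element returned by extract().
        return record_id * 10

    @task
    def load(values):
        # Receives the collected results of all mapped transform() runs.
        print(f"loaded {values}")

    # Dynamic task mapping (Airflow 2.3+): fan transform() out over extract()'s output.
    load(transform.expand(record_id=extract()))


example_taskflow_mapping()
```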
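For the second point, a hedged sketch of how tasks split between Celery workers and Kubernetes pods under the CeleryKubernetes executor. It assumes the executor is already enabled in the deployment (for example `executor: "CeleryKubernetesExecutor"` in the official Helm chart's values, or `AIRFLOW__CORE__EXECUTOR=CeleryKubernetesExecutor`); the DAG and task names are placeholders:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_celery_kubernetes",  # hypothetical DAG id
    schedule_interval=None,
    start_date=datetime(2023, 1, 1),
    catchup=False,
)
def example_celery_kubernetes():
    @task
    def light_task():
        # No queue set: under CeleryKubernetesExecutor this runs on a Celery worker.
        print("runs on a Celery worker")

    @task(queue="kubernetes")
    def heavy_task():
        # Queue matches [celery_kubernetes_executor] kubernetes_queue
        # (default "kubernetes"), so this task launches in its own pod.
        print("runs in a dedicated Kubernetes pod")

    light_task() >> heavy_task()


example_celery_kubernetes()
```

Only the task's `queue` decides where it runs, so a single DAG can mix cheap Celery-backed tasks with heavier, pod-isolated ones.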