Data Pipelines with Apache Airflow
If you need an efficient data pipeline, there is no better tool than Apache Airflow. Our GoDataDriven engineers — Bas Harenslak and Julian de Ruiter — wrote the book Data Pipelines with Apache Airflow, which shows you how to simplify and automate data pipelines, reduce operational overhead, and smoothly integrate all the technologies in your stack. They draw on their consulting experience with companies like Heineken, Unilever, and Booking.com to present relevant use cases and applications.
But what is Airflow, what needs does it fill, and what challenges should you expect when starting with it?
In this on-demand webinar, Giovanni asks Apache Airflow expert Bas Harenslak everything about the platform. For example:
- What’s new in Airflow 2.0?
- What kind of needs does Airflow solve?
- What’s best practice to split Airflow DAGs?
- What’s coming up in Airflow 3.0?
Watch this webinar on-demand now
Airflow is an open source platform to author, schedule, and monitor workflows. First open sourced by Airbnb, it is now part of the Apache Software Foundation.
Written in Python, Airflow helps you manage workflows, architecting and orchestrating complex data pipelines with a few lines of code.
Thanks to its vibrant community, Airflow comes with integrations for cloud technologies from AWS, GCP, and Azure, and is available as a managed service from a number of companies, including Astronomer.io.
About Bas Harenslak
Bas is a Solutions Architect at Astronomer. Astronomer helps organizations adopt Apache Airflow®, the leading open-source data workflow orchestration platform that helps organizations get their data in motion.