Apache Airflow Training

1-Day Training

Develop workflows with Apache Airflow!

Have you ever scheduled a chain of cron jobs only to discover down the line that data was missing? Apache Airflow eliminates that problem. Learn to author, schedule, and monitor workflows through hands-on experience with the leading open source platform in the space.

“It was a hands-on and tangible course. We could apply what we learned in a matter of minutes. The trainer did a great job of answering ad-hoc questions that complemented the material. We appreciated the fact that we could apply what we were taught directly to our company.” —Technical Leader & Software Architect

Register Now Through the Xebia Academy Website

You will be redirected to the Xebia Academy Website for registration


This online course is perfect for

The Apache Airflow course is aimed at Data Scientists and Data Engineers who want to bring their workflows to production. If you want to learn best practices for monitoring, controlling, and running your data pipelines with Airflow, this course is the way to do it. To get the most out of the day, we recommend at least one year of experience working with Python in the data field. You should know how to communicate with databases and understand common file formats, such as Parquet and JSON.

What will you learn during Apache Airflow training?

You will learn the terminology and best practices of writing directed acyclic graphs (DAGs) in Apache Airflow and gain hands-on experience with writing and maintaining data-driven workflows. You will be able to confidently set up production-quality pipelines.

The Program

The program consists of both theory and hands-on exercises.

  • The essential components of Apache Airflow
  • Running and managing workflows
  • Creating dynamic workflows with Jinja templating
  • Sharing state between tasks with XComs


Course details

You will learn:

  • A rundown of the Apache Airflow user interface
  • How to create and monitor DAGs
  • The basics of using the most critical operators
  • How to create dynamic workflows with branching
  • How to communicate with external systems using hooks and connections
  • How to trigger your workflows with sensors

Download Training brochure

Download the GoDataDriven brochure for a complete overview of available training sessions and data engineering, data science, and analytics translator learning journeys.

Download Brochure

Training Formats

This training is available in the following formats:

In-Company Classroom

In-Company training is perfect for groups of 6 or more. The training takes place online, at your office, or at one of our modern training facilities.

Online Virtual Classroom

Virtual Classrooms provide you with an interactive environment to effectively develop your skills, right from the comfort of your own home or office.

Data Science Engineering Journey

This data engineering learning journey is available to all data experts. Our extensive training programs are designed to develop your skills from junior to senior.

How do you become a data engineering expert? Start here! We’ve put together a carefully crafted learning journey for data engineers. Knowing engineers love to figure things out on their own, we packed the program with opportunities to learn, hands-on, by solving real-life situations. Plus, there’s plenty of practical philosophy, too.

We’ll teach you how to leverage Docker to ease your deployments and navigate code written by data scientists (Advanced Python and Data Science in Production). You will learn to use Apache Airflow, Apache Spark, and Kafka like a forklift to move data around.

Click here for more information about the Learning Journey for Data Engineers

GoDataDriven - Data Engineer Learning Journey

More information

Any questions? Please get in touch!

Contact Gert-Jan Steltenpool, our Sales Director, if you want to know more. He’ll be happy to help you!