Develop workflows with Apache Airflow!
Have you ever scheduled a chain of cron jobs, only to discover down the line that data was missing? Apache Airflow eliminates that problem. Learn to author, schedule, and monitor workflows through hands-on experience with the leading open-source workflow orchestration platform.
“It was a hands-on and tangible course. We could apply what we learned in a matter of minutes. The trainer did a great job of answering ad-hoc questions that complemented the material. We appreciated the fact that we could apply what we were taught directly to our company.” —Technical Leader & Software Architect, bol.com
Clients we've helped
What you'll learn
- An overview of the Apache Airflow user interface
- How to create and monitor DAGs
- The basics of using the most critical operators
- How to create dynamic workflows with branching
- How to communicate with external systems using hooks and connections
- How to trigger your workflows with sensors
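To give a taste of what these topics look like in practice, here is a minimal sketch of a DAG that waits on a sensor and then branches, assuming Airflow 2.x; the DAG id, file path, and task names are illustrative, not part of the course material:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator
from airflow.sensors.filesystem import FileSensor


def _choose_branch(**context):
    # Pick a downstream task id based on the day of the week.
    if context["logical_date"].weekday() < 5:
        return "process_weekday"
    return "process_weekend"


with DAG(
    dag_id="branching_example",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Sensor: wait until the input file lands before doing anything else.
    wait_for_file = FileSensor(task_id="wait_for_file", filepath="/data/input.csv")

    # Branching: only the task whose id is returned above will run.
    branch = BranchPythonOperator(task_id="branch", python_callable=_choose_branch)

    process_weekday = EmptyOperator(task_id="process_weekday")
    process_weekend = EmptyOperator(task_id="process_weekend")

    wait_for_file >> branch >> [process_weekday, process_weekend]
```

During the training you build DAGs like this yourself and watch them run in the Airflow UI.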
The program consists of both theory and hands-on exercises, covering:
- The essential components of Apache Airflow
- Running and managing workflows
- Creating dynamic workflows with Jinja templating
- Sharing state between tasks with XComs
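As an illustration of the last two topics, here is a minimal sketch, again assuming Airflow 2.x with illustrative names and values, in which one task pushes a value to XCom and a downstream task reads it back through a Jinja template:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def _count_rows(**context):
    # The return value is automatically pushed to XCom
    # under the key "return_value".
    return 42  # illustrative row count


with DAG(
    dag_id="templating_and_xcoms",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    count_rows = PythonOperator(task_id="count_rows", python_callable=_count_rows)

    # Jinja templating: {{ ds }} renders to the run's logical date, and
    # ti.xcom_pull reads the value pushed by the upstream task at runtime.
    report = BashOperator(
        task_id="report",
        bash_command='echo "{{ ds }}: {{ ti.xcom_pull(task_ids=\'count_rows\') }} rows"',
    )

    count_rows >> report
```

Templated fields like `bash_command` are rendered per run, which is what makes workflows like this dynamic without changing the DAG code.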
Data Engineering Learning Journey
Get certified in Apache Airflow Fundamentals
Apache Airflow is the leading orchestrator for authoring, scheduling, and monitoring data pipelines. It has quickly become an invaluable asset in any data professional’s toolbox.
Receiving an Astronomer Certification for Apache Airflow Fundamentals demonstrates your knowledge of Airflow's core concepts and your ability to make sound architectural decisions, understand applied use cases, and design data pipelines.
Once you pass the exam, you will receive an official certificate that you can show off to your organization and peers alike. More information about the certification is available from Astronomer.
Kris Geusebroek, Big Data Hacker and Trainer
Kris is a seasoned, communicative developer with a passion for combining technologies to create new possibilities for the people around him. He started out developing in Java and gained extensive experience building Geographical Information Systems. Over time, he developed a passion for open-source solutions.
In recent years, Kris has been working with distributed systems such as Hadoop and graph databases such as Neo4j for large enterprises.
Clients include: Rabobank, Wehkamp, Dutch National Police, ING, KNAB, Schiphol, ABN AMRO, and Technische Unie
Apache Airflow Training
The Right Format For Your Preferred Learning Style
Structured, to-the-point, good combination of theory and practical examples, very knowledgeable trainer who can explain concepts very well
I liked every aspect of this training and would like to thank the trainers. They did an excellent job of explaining how to use Spark for data science. This is the fourth GoDataDriven training I’ve followed. All were great, but this was the best one so far.
Climbing a steep Python and Machine Learning curve in three days. This would have taken me months on my own.