Develop workflows with Apache Airflow!
Have you ever scheduled a chain of cron jobs only to discover down the line that data was missing? Apache Airflow eliminates that problem. Learn to author, schedule, and monitor workflows through hands-on experience with the leading open source platform in the space.
“It was a hands-on and tangible course. We could apply what we learned in a matter of minutes. The trainer did a great job of answering ad-hoc questions that complemented the material. We appreciated the fact that we could apply what we were taught directly to our company.” —Technical Leader & Software Architect, bol.com
What you'll learn
- How to navigate the Apache Airflow user interface
- How to create and monitor DAGs
- The basics of using the most critical operators
- How to create dynamic workflows with branching
- How to communicate with external systems using hooks and connections
- How to trigger your workflows with sensors
The program consists of both theory and hands-on exercises covering:
- The essential components of Apache Airflow
- Running and managing workflows
- Creating dynamic workflows with Jinja templating
- Sharing state between tasks with XComs
“Climbing a steep Python and Machine Learning curve in three days. This would have taken me months on my own.”
This online course is perfect for
The Apache Airflow course is aimed at Data Scientists and Data Engineers who want to bring their workflows to production. If you want to learn the best practices for monitoring, controlling, and running your data pipelines with Airflow, this course is the best way to do so! To get the most out of the day, we recommend you have at least one year of experience working with Python in the data field. You should know how to communicate with databases and understand different file formats, such as Parquet and JSON.
What will you learn during Apache Airflow training?
You will learn the terminology and best practices of writing directed acyclic graphs (DAGs) in Apache Airflow and gain hands-on experience with writing and maintaining data-driven workflows. You will be able to confidently set up production-quality pipelines.
The Learning Journey for Data Engineers
Learn how to take data and AI ideas from concept to prototype to production-ready application. Acquire the skills to develop and run Data and AI solutions at enterprise scale with ease! Take part in a specific training or advance through the entire journey. Learn how to build secure data platforms and reliable AI applications that are engineered for scale.
The Right Format For Your Preferred Learning Style
At GoDataDriven we offer four distinct training modalities:
- In-Classroom & In-Company Training
- Online, Instructor-Led Training
- Hybrid and Blended Learning
- Self-Paced Training