WEBINAR

Stopping bad data before it breaks your Airflow pipelines

March 4, 2026 | 11am ET / 4pm GMT

  • Kenten Danas
  • Marc Lamberti
  • Volker Janz

Bad data in, bad decisions out. Data quality is fundamental to trustworthy analytics and AI. Silent data issues such as missing records, schema changes, and duplicate entries can go undetected until dashboards break or models fail. By then, the damage is done.

Fortunately, Airflow gives you many options for ensuring bad data never makes it to production. Dag-level checks run as part of your pipeline and can stop downstream tasks if a check fails. Platform-level checks run within your orchestration platform, independently of Dag execution, and give you a bigger picture for ongoing monitoring, even when there is a problem with Airflow itself.
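As a preview, here is a minimal sketch of what a Dag-level check can look like, using the SQLColumnCheckOperator from the common SQL provider. The connection ID, table, and column names are hypothetical placeholders, and the exact checks you run will depend on your data.

```python
# A minimal sketch of a Dag-level data quality check.
# The "warehouse_db" connection and "orders" table are hypothetical.
from pendulum import datetime

from airflow.decorators import dag, task
from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator


@dag(start_date=datetime(2026, 1, 1), schedule="@daily", catchup=False)
def orders_quality_check():
    # Fails the task if any column check does not pass,
    # which stops the downstream task from running.
    check_orders = SQLColumnCheckOperator(
        task_id="check_orders",
        conn_id="warehouse_db",  # hypothetical connection ID
        table="orders",          # hypothetical table
        column_mapping={
            "order_id": {
                "null_check": {"equal_to": 0},
                "unique_check": {"equal_to": 0},
            },
            "amount": {"min": {"geq_to": 0}},
        },
    )

    @task
    def load_to_reporting():
        # Downstream work only runs if the quality check above succeeds.
        ...

    check_orders >> load_to_reporting()


orders_quality_check()
```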

In this webinar, we’ll cover everything you need to know about both approaches to data quality, including:

  • When to choose Dag-level checks, platform-level checks, or both
  • Available data quality check operators in Airflow and how to use them
  • Platform-level data quality with Astro Observe, including custom SQL monitors, table-level lineage, and more

Save Your Spot Today

Build, run, & observe your data workflows.
All in one place.


Try Astro today and get up to $20 in free credits during your 14-day trial.