Airflow has a comprehensive testing infrastructure with multiple kinds of tests, designed to ensure reliability and functionality across different deployment scenarios and integrations. The testing framework includes:
- Unit tests are Python tests that do not require any additional integrations. Unit tests are available both in the Breeze environment and in a local virtualenv. Note that for a pull request to be reviewed, it should include unit tests, unless tests are not required, e.g. for documentation changes.
- Integration tests are available in the Breeze environment, which is also used for Airflow CI tests. Integration tests are special tests that require additional services running, such as Postgres, MySQL, or Kerberos.
- Docker Compose tests are tests we run to check if our quick-start Docker Compose setup works.
- Kubernetes tests are tests we run to check if our Kubernetes deployment and the Kubernetes Pod Operator work.
- Helm unit tests are tests we run to verify that the Helm Chart renders correctly for various configuration parameters.
- System tests are automatic tests that use external systems like Google Cloud and AWS. These tests are intended for end-to-end DAG execution.
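To illustrate the first category: a unit test in this sense is a plain pytest-style test that needs no database, network, or extra services. The helper function below is hypothetical and exists only to show the shape of such a test:

```python
# A minimal sketch of a self-contained unit test; normalize_dag_id is a
# hypothetical helper, not part of Airflow's actual codebase.
def normalize_dag_id(name: str) -> str:
    """Lowercase a DAG id and replace spaces with underscores."""
    return name.strip().lower().replace(" ", "_")


def test_normalize_dag_id():
    # Runs without any external integrations -- a pure unit test.
    assert normalize_dag_id("  My Example DAG ") == "my_example_dag"


test_normalize_dag_id()
```

Tests of this shape can be collected and run by pytest both inside Breeze and in a local virtualenv.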
You can also run other kinds of tests when you are developing Airflow packages:
- Testing packages is a document that describes how to manually build and test pre-release candidate packages of Airflow and providers.
- Python client tests are tests we run to check if the Python API client works correctly.
- DAG testing is a document that describes how to test DAGs in a local environment with DebugExecutor. Note that this is a legacy method - you can now use the dag.test() method to test DAGs.
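As a sketch of the newer approach, assuming Airflow 2.5+ is installed (where dag.test() runs the whole DAG in a single process, with no scheduler required); the DAG id and task below are made-up examples:

```python
# Sketch of testing a DAG with dag.test(), assuming Airflow 2.5+;
# the DAG and task names here are hypothetical illustrations.
import datetime

try:
    from airflow.decorators import dag, task
except ImportError:  # Airflow is not installed in this environment
    dag = task = None

if dag is not None:

    @dag(schedule=None, start_date=datetime.datetime(2024, 1, 1), catchup=False)
    def example_dag():
        @task
        def hello():
            print("hello from dag.test()")

        hello()

    if __name__ == "__main__":
        # Executes all tasks in-process, useful for quick local debugging.
        example_dag().test()
```

Unlike the DebugExecutor approach, this needs no executor configuration and can be run directly with `python my_dag_file.py`.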
You can learn how to build documentation, as you will likely need to update documentation as part of your PR.
You can also learn about working with git, as you will need to understand how git branching works and how to rebase your PR.