📄️ DAGs - Add DAG Documentation
Overview
📄️ DAGs - Calling External Python Scripts
If you need additional libraries for your DAG such as pandas, let us know so that we can configure them in your environment.
📄️ DAGs - Dynamically Set Schedule
By default, DAGs are created in a paused state in Airflow, but you can change this by setting the is_paused_upon_creation=False option. However, you will likely not want to schedule DAGs in a development Airflow instance. The steps below describe how to avoid setting a schedule in a development Airflow instance.
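A minimal sketch of one way to set the schedule dynamically: pick the schedule based on an environment variable so development instances get no schedule. The AIRFLOW_ENVIRONMENT variable name and the cron expression are assumptions for illustration only.

```python
import os
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# AIRFLOW_ENVIRONMENT is an assumed variable name; use whatever your
# environment actually exposes to distinguish dev from prod.
is_dev = os.environ.get("AIRFLOW_ENVIRONMENT", "dev") == "dev"

with DAG(
    dag_id="dynamic_schedule_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None if is_dev else "0 6 * * *",  # no schedule in dev
    is_paused_upon_creation=False,  # start unpaused when the DAG is first created
    catchup=False,
) as dag:
    BashOperator(task_id="say_hello", bash_command="echo hello")
```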
📄️ DAGs - Generate DAGs from yml
You have the option to write your DAGs in Python, or you can write them in yml and have dbt-coves generate the Python DAG for you.
📄️ DAGs - Get Current Git Branch Name from a DAG Task
In Airflow, Datacoves will place your repo into /opt/airflow/dags/
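Since the repo is synced to /opt/airflow/dags/, one way to read the current branch from a task is to run git against that path. A minimal sketch; the DAG and task names are illustrative.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="current_branch_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Print the branch currently checked out in the synced repo
    get_branch = BashOperator(
        task_id="get_current_branch",
        bash_command="git -C /opt/airflow/dags rev-parse --abbrev-ref HEAD",
    )
```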
📄️ DAGs - Load from S3 to Snowflake
Schema Evolution
📄️ DAGs - Run ADF Pipelines
You can use Airflow in Datacoves to trigger a Microsoft Azure Data Factory pipeline. This guide will walk you through the configuration process.
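A minimal sketch of triggering an ADF pipeline, assuming the apache-airflow-providers-microsoft-azure package and an Airflow connection named azure_data_factory_default; the pipeline, resource group, and factory names are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.azure.operators.data_factory import (
    AzureDataFactoryRunPipelineOperator,
)

with DAG(
    dag_id="run_adf_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_pipeline = AzureDataFactoryRunPipelineOperator(
        task_id="run_pipeline",
        azure_data_factory_conn_id="azure_data_factory_default",
        pipeline_name="my_pipeline",              # placeholder pipeline name
        resource_group_name="my_resource_group",  # placeholder resource group
        factory_name="my_data_factory",           # placeholder factory name
        wait_for_termination=True,                # block until the ADF run finishes
    )
```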
📄️ DAGs - Run Airbyte sync jobs
In our quest to simplify the way tools integrate in the Modern Data Stack, we developed the generate airflow-dags command in the dbt-coves library.
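If you write the DAG by hand instead of generating it, a sync can be triggered with the Airbyte provider operator. A minimal sketch assuming the apache-airflow-providers-airbyte package; the Airflow connection name and the Airbyte connection id (a UUID from the Airbyte UI) are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(
    dag_id="airbyte_sync_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    sync_source = AirbyteTriggerSyncOperator(
        task_id="airbyte_sync",
        airbyte_conn_id="airbyte_default",
        connection_id="00000000-0000-0000-0000-000000000000",  # placeholder UUID
        asynchronous=False,  # wait for the sync to finish before the task succeeds
    )
```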
📄️ DAGs - Run dbt
Airflow synchronizes your git repository's configured branch every minute (the branch specified in the Git branch name field of the environment's DAG sync configuration).
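A minimal sketch of running dbt from the synced repo with a plain BashOperator; the transform/ project path inside /opt/airflow/dags is an assumption about your repository layout.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_run",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",
    catchup=False,
) as dag:
    # Change into the dbt project folder in the synced repo and run dbt
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/airflow/dags/transform && dbt run",
    )
```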
📄️ DAGs - Retry dbt jobs
Overview
📄️ DAGs - Run Databricks Notebooks
You can use Airflow in Datacoves to trigger a Databricks notebook. This guide will walk you through the configuration process.
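A minimal sketch of triggering a notebook run, assuming the apache-airflow-providers-databricks package; the connection name, cluster id, and notebook path are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="run_databricks_notebook",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_notebook = DatabricksSubmitRunOperator(
        task_id="run_notebook",
        databricks_conn_id="databricks_default",
        existing_cluster_id="1234-567890-abcde123",  # placeholder cluster id
        notebook_task={"notebook_path": "/Shared/example_notebook"},
    )
```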
📄️ DAGs - Run Fivetran sync jobs
In addition to triggering Airbyte sync jobs, you can also trigger Fivetran sync jobs from your Airflow DAG.
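A minimal sketch assuming the airflow-provider-fivetran package; the import path, Airflow connection name, and connector id (taken from the Fivetran connector URL) are assumptions and placeholders.

```python
from datetime import datetime

from airflow import DAG
from fivetran_provider.operators.fivetran import FivetranOperator

with DAG(
    dag_id="fivetran_sync_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    fivetran_sync = FivetranOperator(
        task_id="fivetran_sync",
        fivetran_conn_id="fivetran_default",
        connector_id="my_connector_id",  # placeholder connector id
    )
```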
📄️ DAGs - Test DAGs
In Datacoves, you can easily test your Airflow DAGs with pytest from the command line. You can also run these validations in your CI/CD pipeline.
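A minimal pytest sketch that checks every DAG in the bag imports cleanly; adjust the dag_folder path to match your environment.

```python
from airflow.models import DagBag


def test_no_dag_import_errors():
    # Load all DAGs from the synced repo and fail if any of them raise on import
    dag_bag = DagBag(dag_folder="/opt/airflow/dags", include_examples=False)
    assert dag_bag.import_errors == {}, f"DAG import errors: {dag_bag.import_errors}"
```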
📄️ DAGs - Using Variables and Connections
dbt-coves generate airflow-dags does not support reading variables/connections, but you can generate the initial Python Airflow DAG and then add the connection/variable information yourself.
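A minimal sketch of reading an Airflow Variable and Connection inside a task function; the variable and connection names are placeholders you would create in the Airflow UI or your secrets backend.

```python
from airflow.hooks.base import BaseHook
from airflow.models import Variable


def use_variable_and_connection():
    # Read a Variable with a fallback default, and look up a Connection by id
    target_schema = Variable.get("target_schema", default_var="dev")
    conn = BaseHook.get_connection("my_warehouse")
    print(f"Loading into {target_schema} on host {conn.host}")
```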