📄️ Airflow - What to know
- Ruff is installed to flag unused imports and unused variables, as well as to provide general Python linting.
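For context, here is a minimal, hypothetical snippet showing the kind of issues Ruff reports; the rule codes (F401, F841) are standard Ruff/pyflakes codes, and the function is purely illustrative.

```python
# Hypothetical DAG-file snippet with issues Ruff would flag:
import json  # F401: `json` imported but unused


def build_payload():
    unused = "never read"  # F841: local variable `unused` is assigned but never used
    return {"status": "ok"}
```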
📄️ Airflow - Initial setup
Turn on Airflow
📄️ Airflow - Accessing the Airflow API
Users must have Project Level Admin Group to use the Airflow API. The API will allow you to view secrets values in plain text. Always exercise the principle of least privilege.
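As a minimal sketch, the stable Airflow REST API can be called with any HTTP client; the base URL and credentials below are placeholders, and your Datacoves environment's URL and authentication method may differ.

```python
import requests

AIRFLOW_URL = "https://airflow.example.com"  # hypothetical base URL

# List DAGs via the stable REST API (GET /api/v1/dags).
response = requests.get(
    f"{AIRFLOW_URL}/api/v1/dags",
    auth=("my_user", "my_password"),  # placeholder credentials
    timeout=30,
)
response.raise_for_status()
for dag in response.json()["dags"]:
    print(dag["dag_id"])
```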
📄️ Airflow - Sync Internal Airflow database
It is now possible to synchronize the Datacoves Airflow database with your Data Warehouse.
📄️ Airflow - Trigger a DAG using Datasets
Overview
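A minimal sketch of dataset-driven scheduling (available since Airflow 2.4): a producer task declares a dataset as an outlet, and a consumer DAG schedules on that dataset. The URI, DAG ids, and callables below are illustrative.

```python
import pendulum
from airflow import DAG, Dataset
from airflow.operators.python import PythonOperator

my_dataset = Dataset("s3://example-bucket/orders.parquet")  # hypothetical URI

with DAG(
    dag_id="producer",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
):
    PythonOperator(
        task_id="update_orders",
        python_callable=lambda: print("orders updated"),
        outlets=[my_dataset],  # marks the dataset as updated when this task succeeds
    )

with DAG(
    dag_id="consumer",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=[my_dataset],  # runs whenever the producer updates the dataset
    catchup=False,
):
    PythonOperator(task_id="process_orders", python_callable=lambda: print("processing"))
```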
📄️ Airflow - Use Key-Pair Authentication
This documentation presumes you are familiar with Datacoves' Service Connections and how to configure them.
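For reference, here is a minimal sketch of key-pair authentication using the Snowflake Python connector directly; the key path, account, and user are placeholders, and a Datacoves Service Connection would normally handle this for you.

```python
import snowflake.connector
from cryptography.hazmat.primitives import serialization

# Load the PEM-encoded private key (hypothetical path).
with open("/path/to/rsa_key.p8", "rb") as key_file:
    private_key = serialization.load_pem_private_key(
        key_file.read(),
        password=None,  # or the passphrase bytes if the key is encrypted
    )

# Connect using the DER-serialized key, per the Snowflake connector docs.
conn = snowflake.connector.connect(
    account="my_account",  # placeholder
    user="MY_USER",        # placeholder
    private_key=private_key.private_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    ),
)
```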
🗃️ DAGs
14 items
📄️ Notifications - Send Emails
Getting notifications when there is a failure is critical for data teams, and Airflow offers multiple ways to keep users informed about the status of a DAG.
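The simplest approach uses the standard email settings in default_args, as in this minimal sketch; the recipient address is a placeholder, and SMTP must already be configured in your Airflow deployment.

```python
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "email": ["data-team@example.com"],  # hypothetical recipient
    "email_on_failure": True,
    "email_on_retry": False,
}

with DAG(
    dag_id="emails_on_failure",
    default_args=default_args,
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
):
    # If this task fails, Airflow sends an email to the addresses above.
    BashOperator(task_id="might_fail", bash_command="exit 1")
```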
📄️ Notifications - Send Microsoft Teams notifications
As stated in how to send email notifications, Airflow offers multiple ways to inform users about DAG and task status.
📄️ Notifications - Send Slack notifications
As stated in how to send email notifications, Airflow offers multiple ways to inform users about DAG and task status.
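Both the Teams and Slack guides build on the same idea: an on_failure_callback that posts to an incoming webhook. Here is a minimal Slack-flavored sketch; the webhook URL is a placeholder, and a Teams webhook would follow the same pattern with a different payload.

```python
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # hypothetical


def notify_slack(context):
    # `context` is the task context Airflow passes to failure callbacks.
    ti = context["task_instance"]
    requests.post(
        SLACK_WEBHOOK_URL,
        json={"text": f"Task {ti.task_id} in DAG {ti.dag_id} failed."},
        timeout=10,
    )


# Attach per task, or DAG-wide via default_args:
# default_args = {"on_failure_callback": notify_slack}
```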
📄️ Secrets - AWS Secrets Manager
Datacoves integrates with the Airflow Secrets Backend Interface, offering support for both its native Datacoves Secrets Backend and AWS Secrets Manager. For other Airflow-compatible Secrets Managers, please reach out to us.
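Once a secrets backend is configured, lookups are transparent to DAG code. The sketch below assumes, for illustration, that Airflow was configured with the AWS provider's backend (the environment-variable values in the comments are examples, not Datacoves settings):

```python
# Assumed backend configuration (illustrative values):
#   AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
#   AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_prefix": "airflow/connections",
#                                     "variables_prefix": "airflow/variables"}
from airflow.models import Variable

# Resolves "airflow/variables/api_token" in AWS Secrets Manager first,
# falling back to the Airflow metadata database. Typically called
# inside a task at runtime rather than at DAG parse time.
api_token = Variable.get("api_token")
```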
📄️ Secrets - Datacoves secrets manager
Datacoves includes a built-in Secrets Manager that allows you to securely store and manage secrets for both administrators and developers. Secrets can be stored at the project or environment level and easily shared across other tools in your stack, ensuring seamless integration and enhanced security. Creating or editing a secret in the Datacoves Secrets Manager is straightforward. Be sure to prefix all secrets stored in the Datacoves Secrets Manager with datacoves-.
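Assuming the Datacoves Secrets Backend serves these secrets as Airflow Variables (the secret name below is a placeholder), reading one from a DAG is a one-liner:

```python
from airflow.models import Variable

# Note the required datacoves- prefix on the secret name.
warehouse_password = Variable.get("datacoves-warehouse_password")
```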
📄️ Worker - Custom Worker Environment
If you need to run Airflow tasks in a custom environment with pre-installed libraries and tools, we recommend building your own custom Docker image, uploading it to an image repository such as Docker Hub, and referencing it in your DAG's task operator.
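With the Kubernetes executor, a custom image can be referenced per task through executor_config, as in this minimal sketch; the image name and command are placeholders.

```python
from airflow.operators.bash import BashOperator
from kubernetes.client import models as k8s

custom_image_task = BashOperator(
    task_id="run_on_custom_image",
    bash_command="my-preinstalled-tool --version",  # hypothetical tool
    executor_config={
        "pod_override": k8s.V1Pod(
            spec=k8s.V1PodSpec(
                containers=[
                    k8s.V1Container(
                        name="base",  # must be "base" to override the task container
                        image="myorg/my-custom-airflow-worker:1.0",  # hypothetical image
                    )
                ]
            )
        )
    },
)
```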
📄️ Worker - Request Memory and CPU
Sometimes you need to run tasks that require more memory or compute power. Airflow task definitions that use a Kubernetes execution environment allow for this type of configuration.
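Using the same pod_override mechanism, resource requests and limits can be set per task; the amounts in this sketch are illustrative, not recommendations.

```python
from airflow.operators.python import PythonOperator
from kubernetes.client import models as k8s

heavy_task = PythonOperator(
    task_id="memory_hungry",
    python_callable=lambda: None,  # placeholder callable
    executor_config={
        "pod_override": k8s.V1Pod(
            spec=k8s.V1PodSpec(
                containers=[
                    k8s.V1Container(
                        name="base",  # overrides the task container
                        resources=k8s.V1ResourceRequirements(
                            requests={"memory": "4Gi", "cpu": "1"},
                            limits={"memory": "8Gi", "cpu": "2"},
                        ),
                    )
                ]
            )
        )
    },
)
```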