Datacoves Docs

  • Home
  • Getting Started
  • Administrator
    • Account Pre-reqs
    • Configure Airflow
    • Configure Git Repository
    • Creating Airflow DAGs
    • User Management
  • Developer
    • Snowflake Extension
    • Transform Tab
    • Working with dbt in Datacoves
    • Using Git
  • Diving Deeper
  • How to
    • Airflow
      • Airflow - Initial setup
      • Airflow - Accessing the Airflow API
      • Airflow - Sync Internal Airflow database
      • Airflow - Trigger a DAG using Datasets
      • DAGs - Add DAG Documentation
      • DAGs - Calling External Python Scripts
      • DAGs - Dynamically Set Schedule
      • DAGs - Generate DAGs from yml
      • DAGs - Get Current Git Branch Name from a DAG Task
      • DAGs - Load from S3 to Snowflake
      • DAGs - Run ADF Pipelines
      • DAGs - Run Airbyte sync jobs
      • DAGs - Run dbt
      • DAGs - Run Databricks Notebooks
      • DAGs - Run Fivetran sync jobs
      • DAGs - Test DAGs
      • DAGs - Using Variables and Connections
      • Notifications - Send Emails
      • Notifications - Send Microsoft Teams notifications
      • Notifications - Send Slack notifications
      • Secrets - AWS Secrets Manager
      • Secrets - Datacoves Secrets Manager
      • Worker - Custom Worker Environment
      • Worker - Request Memory and CPU
    • Datacoves
      • Configure Connection Templates
      • Configure Datacoves Secret
      • Configure Environments
      • Configure Groups
      • Configure Integrations
      • Configure Invitations
      • Configure Projects
        • Configure AWS Secrets Manager
        • Configure Azure DevOps
          • Create your EntraID App
          • Add EntraID App to DevOps Portal
          • Gather DevOps Auth Details
          • Authenticate Azure DevOps
      • Configure Service Connections
      • Manage Users
      • Update Repository
    • DataHub
      • Configure dbt metadata ingestion
      • Configure Snowflake metadata ingestion
      • Manage DataHub using the CLI
    • DataOps
      • Releasing a new feature
    • dbt
      • Advanced Debugging
      • Compilation Errors
      • Database Errors
      • Dependency Errors
      • Runtime Errors
    • Git
      • SSH Keys configuration
    • Metrics & Logs
      • Git Sync/S3 Failures
    • My Airflow
      • Migrating from Environment Service Connections
      • My Import
      • Use My Airflow
    • Snowflake
      • Warehouses, Schemas and Roles
    • Superset
      • Add a Database
      • Add a Dataset
    • VS Code
      • Datacoves Copilot
        • Config
        • Usage
      • Initial Configuration
        • BigQuery
        • Databricks
        • Redshift
        • Snowflake
      • Custom Environment Variables
      • Override VS Code settings
      • Reset User Env
      • Reset Git
  • Explanation
    • Best Practices
      • Datacoves
        • Folder Structure
      • dbt
        • dbt Guidelines
        • Object Naming Standards
        • What are Inlets, Bays, and Coves
      • Git
      • Snowflake
        • Security Model
        • GDPR and Time-Travel
  • Reference
    • Administration Menu
      • Account Settings & Billing
      • Connection Templates
      • Environments
      • Groups
      • Integrations
      • Invitations
      • Projects
      • Secrets
      • Service Connections
      • Users
    • Airflow
      • Airflow Best Practices
      • Airflow Config Defaults
      • Airflow Variables
      • DAG Generators
      • Datacoves Airflow Decorators
      • Datacoves CLI Commands
      • Datacoves Environment Service Connection Variables
      • Datacoves Operators
    • Datacoves
      • Versioning
      • VPC Deployment
    • Metrics & Logs
      • Grafana
    • Security
    • VS Code
      • CSVs
      • Datacoves Environment Variables
      • VS Code Tips
  • Tutorials
  • Platform
  • Status Tracker
  • SLA
Copyright © 2021-2024 datacoves.com
Airflow Reference