Airflow ships a core CLI; Astronomer adds the astro CLI on top for project scaffolding and deployment. Together they cover most operational tasks.

Astro CLI

Project lifecycle

astro dev init              # Scaffold a new project in the current directory
astro dev start             # Start a local Airflow stack in Docker
astro dev stop              # Stop containers, preserve state
astro dev restart           # Stop and start in one command
astro dev kill              # Stop and wipe all state
astro dev logs              # Tail logs from local containers
astro dev logs --scheduler  # Filter to one component
astro dev pytest            # Run the project's pytest suite
astro dev parse             # Parse all DAGs; fails on parse errors

Running commands inside Airflow

Local:

astro dev run <airflow-subcommand> <args…>

Examples:

astro dev run dags list
astro dev run dags trigger my_dag
astro dev run dags test my_dag 2026-04-21
astro dev run tasks test my_dag my_task 2026-04-21
astro dev run variables set my_var my_value
astro dev run connections add aws_default --conn-type aws

astro dev run forwards to the airflow CLI inside the running scheduler container.

Deployment

astro login                          # Authenticate against Astro
astro deployment list                # List deployments in your workspace
astro deploy                         # Deploy the current project
astro deploy --deployment-id <id>    # Deploy to a specific deployment
astro deploy --dags                  # Deploy only the dags/ folder (faster)
astro deploy --prompt                # Prompt before deploying

Note

astro deploy --dags is the fast path for DAG-only changes. If requirements.txt, packages.txt, or Dockerfile changed, do a full astro deploy; the image is rebuilt and redeployed.
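
One way to script that decision, e.g. over `git diff --name-only` output (a sketch; deploy_mode is a hypothetical helper, not part of the astro CLI):

```shell
# Decide deploy mode from a list of changed files:
# any image-level file forces a full deploy, otherwise dags-only is safe.
deploy_mode() {
  for f in "$@"; do
    case "$f" in
      requirements.txt|packages.txt|Dockerfile) echo "full"; return ;;
    esac
  done
  echo "dags"
}

deploy_mode dags/my_dag.py                    # prints: dags
deploy_mode dags/my_dag.py requirements.txt   # prints: full
```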

Environment management

# Set or read environment variables on a deployment
astro deployment variable list --deployment-id <id>
astro deployment variable create --deployment-id <id> \
  KEY=value AIRFLOW__CORE__PARALLELISM=64

# Set secrets (redacted in the UI)
astro deployment variable create --deployment-id <id> \
  --secret DATABRICKS_TOKEN=dapi...
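
Config overrides like AIRFLOW__CORE__PARALLELISM above follow Airflow's AIRFLOW__{SECTION}__{KEY} naming convention. A small converter (illustrative helper, not an astro command):

```shell
# Map "section.key" from airflow.cfg to the matching environment variable name.
airflow_env_name() {
  section=${1%%.*}   # text before the first dot
  key=${1#*.}        # text after it
  echo "AIRFLOW__${section}__${key}" | tr '[:lower:]' '[:upper:]'
}

airflow_env_name core.parallelism   # prints: AIRFLOW__CORE__PARALLELISM
airflow_env_name scheduler.max_tis_per_query
```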

Airflow core CLI

DAGs

airflow dags list                                # All DAGs
airflow dags list-runs --dag-id my_dag          # Recent runs
airflow dags state my_dag 2026-04-21            # State of one run
airflow dags trigger my_dag                      # Trigger a run
airflow dags pause my_dag                        # Pause
airflow dags unpause my_dag                      # Unpause
airflow dags backfill my_dag \                  # Backfill range
  --start-date 2026-04-20 --end-date 2026-04-21
airflow dags test my_dag 2026-04-21             # Test without recording

Tasks

airflow tasks list my_dag                        # Tasks in a DAG
airflow tasks states-for-dag-run my_dag <run_id>
airflow tasks test my_dag my_task 2026-04-21    # Run one task locally
airflow tasks state my_dag my_task 2026-04-21   # State of one task instance
# Task logs live in the UI or the REST API; there is no `tasks log` subcommand
airflow tasks run my_dag my_task 2026-04-21     # Run, record, respect retries

Assets (Airflow 3)

airflow assets list                              # All registered assets
airflow assets details --name <asset_name>       # One asset's metadata
airflow assets materialize --name <asset_name>   # Trigger the DAG that produces it
# Manual asset events go through the REST API, not the CLI

Connections and variables

# Connections
airflow connections list
airflow connections get aws_default
airflow connections test aws_default
airflow connections add databricks_default \
  --conn-type databricks \
  --conn-host adb-1234.cloud.databricks.com
airflow connections delete aws_default

# Variables
airflow variables list
airflow variables get my_key
airflow variables set my_key my_value
airflow variables delete my_key

# Export / import
airflow variables export variables.json
airflow variables import variables.json

Warning

Do not commit variables.json or connection exports to Git if they contain secrets. Use Astronomer's environment variables, AWS Secrets Manager backend, or HashiCorp Vault. airflow variables export includes secret values in plaintext.
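
A quick pre-commit heuristic can catch obvious offenders (illustrative; the sample file stands in for a real export):

```shell
# Scan an exported variables.json for secret-looking keys before it reaches Git.
cat > variables.json <<'EOF'
{"my_key": "my_value", "DATABRICKS_TOKEN": "dapi-example"}
EOF

if grep -qiE '"[^"]*(token|secret|password|api_key)[^"]*"' variables.json; then
  echo "possible secret key found; do not commit"
fi
```

A name-based scan is only a heuristic; it cannot catch secrets stored under innocuous keys.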

Pools

airflow pools list                               # Show all pools
airflow pools get salesforce_api                 # One pool
airflow pools set salesforce_api 5 \             # Create or update
  "Concurrent SF API calls"
airflow pools delete salesforce_api

DB operations

airflow db check                                 # Connectivity test
airflow db check-migrations                      # Pending migrations
airflow db migrate                               # Apply migrations (db upgrade is deprecated)
airflow db reset                                 # DANGEROUS: wipes metadata

Danger

airflow db reset wipes the metadata database: every DAG run history, every connection, every variable. It is a development command; never run it against a production metadata DB.
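
One way to make that harder to do by accident is a wrapper that checks the environment first (a sketch; AIRFLOW_ENV is a hypothetical convention, not an Airflow setting):

```shell
# Illustrative guard: refuse the destructive command outside development.
safe_db_reset() {
  if [ "${AIRFLOW_ENV:-}" != "dev" ]; then
    echo "refusing db reset: AIRFLOW_ENV is '${AIRFLOW_ENV:-unset}'" >&2
    return 1
  fi
  airflow db reset --yes
}

AIRFLOW_ENV=prod
safe_db_reset || echo "blocked"   # prints: blocked
```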

Users (legacy; requires the FAB auth manager)

airflow users list
airflow users create --role Admin ...
airflow users delete --username ...

On Astronomer, user management is via the Astro UI, not the Airflow CLI.

Airflow REST API

For anything the CLI does not cover, hit the REST API directly. Authenticate with a bearer token; on Astro, a Workspace or Deployment API token works.

Get a token (Astronomer)

TOKEN=$(astro workspace token create --role WORKSPACE_OWNER --expiration 1h)

List DAGs

curl -s "https://$AIRFLOW_HOST/api/v1/dags" \
  -H "Authorization: Bearer $TOKEN" | jq

Trigger a DAG run

RUN_ID="manual_$(date -u +%FT%H%MZ)"   # single quotes would block the $(date) expansion
curl -s -X POST \
  "https://$AIRFLOW_HOST/api/v1/dags/my_dag/dagRuns" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d "{\"dag_run_id\": \"$RUN_ID\", \"conf\": {\"key\": \"value\"}}"

Get task instance state

curl -s \
  "https://$AIRFLOW_HOST/api/v1/dags/my_dag/dagRuns/<run_id>/taskInstances/<task_id>" \
  -H "Authorization: Bearer $TOKEN" | jq

Run history

curl -s \
  "https://$AIRFLOW_HOST/api/v1/dags/my_dag/dagRuns?limit=10&order_by=-start_date" \
  -H "Authorization: Bearer $TOKEN" | jq
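
These calls all share a base URL; a small builder keeps the paths consistent (illustrative; af_url and the host value are placeholders):

```shell
# Build an Airflow REST API URL from a resource path.
AIRFLOW_HOST=example.astronomer.run   # placeholder host
af_url() {
  echo "https://$AIRFLOW_HOST/api/v1/$1"
}

af_url "dags/my_dag/dagRuns?limit=10&order_by=-start_date"
# Feed it to curl: curl -s "$(af_url dags)" -H "Authorization: Bearer $TOKEN"
```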

Common recipes

Re-run a failed task

# Just the task (no upstream, no downstream)
airflow tasks run my_dag my_task 2026-04-21

# Clear state and re-run from the UI:
# UI → DAG → Grid → click failed task → Clear → (Include Downstream if needed)

Backfill a date range

airflow dags backfill my_dag \
  --start-date 2026-04-20 --end-date 2026-04-21 \
  --rerun-failed-tasks
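
When you want per-run control instead, a loop over the date range works too. A sketch using GNU date, printing the commands first as a dry run:

```shell
# Dry run: print one trigger command per logical date in the range.
# Swap `echo` for the real call once the list looks right.
start=2026-04-20
end=2026-04-21
d="$start"
while [ "$(date -d "$d" +%s)" -le "$(date -d "$end" +%s)" ]; do
  echo "airflow dags trigger my_dag -e $d"
  d="$(date -d "$d + 1 day" +%F)"
done
```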

Re-serialize DAG definitions

When you change DAG code, Airflow re-parses it automatically within min_file_process_interval (default 30 seconds). To force an immediate re-serialization of all DAGs:

airflow dags reserialize

Inspect the parse time

Slow DAG parsing is a common drag on scheduler throughput:

airflow dags report

Shows parse time per DAG file. If one file takes more than 10 seconds to parse, fix it: move heavy imports out of top-level code and split large DAGs across smaller files.
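
To find what makes a file slow, Python's built-in import profiler helps before touching Airflow at all (a sketch; the stdlib json module stands in for your DAG module):

```shell
# -X importtime reports microseconds per import on stderr.
# The heaviest entries point at imports to defer into task callables.
python3 -X importtime -c "import json" 2>&1 | tail -3
```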

See also