I prefer the command line over web interfaces. I want to run dags and watch the log output in the terminal. Each time an Airflow task runs, a new timestamped directory and file is created, something like:

~/airflow/logs/my-dag/my-task/2018-04-01T00:00:00/1.log

This makes it hard to tail-follow the logs. Thankfully, starting with Airflow 1.9, logging can be configured easily, allowing you to put all of a dag's logs into one file.

Note: If you make this change, you won’t be able to view task logs in the web UI, only in the terminal.

Easy Solution (Airflow 1.10 only)

Set the log_filename_template setting:

export AIRFLOW__CORE__LOG_FILENAME_TEMPLATE="{{ ti.dag_id }}.log"

General Solution (Airflow 1.9+)

Since Airflow 1.9, logging is configured pythonically.

Grab Airflow's default log config, and copy it somewhere on your PYTHONPATH.

curl -O

Set the logging_config_class setting. Make sure this is set in both your scheduler's and workers' environments. (Alternatively, set it in airflow.cfg.)


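Assuming you saved the copied config as log_config.py (the file name and module path here are an assumption; what matters is that the module is importable and the variable name matches the one in the file, DEFAULT_LOGGING_CONFIG in the default config), the setting would look like:

```shell
# Point Airflow at your copied logging config.
# Format is <module>.<config-dict-variable>; adjust both to match your copy.
export AIRFLOW__CORE__LOGGING_CONFIG_CLASS=log_config.DEFAULT_LOGGING_CONFIG
```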
Now you can configure logging to your liking.

Edit the copied config, changing FILENAME_TEMPLATE to:

FILENAME_TEMPLATE = '{{ ti.dag_id }}.log'
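To see why this collapses all of a dag's tasks into one file, here's a quick sketch of the rendering step. Airflow renders FILENAME_TEMPLATE as a Jinja template with the task instance (ti) in context; the FakeTI class below is a stand-in for illustration, not a real Airflow object:

```python
from jinja2 import Template

FILENAME_TEMPLATE = '{{ ti.dag_id }}.log'

# Stand-in for an Airflow task instance (hypothetical, for illustration).
class FakeTI:
    dag_id = 'my-dag'
    task_id = 'my-task'

# Since the template only references ti.dag_id, every task in the dag
# renders to the same filename, regardless of task_id or try number.
filename = Template(FILENAME_TEMPLATE).render(ti=FakeTI())
print(filename)  # → my-dag.log
```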

You should now get all of a dag's log output in a single file.

Tailing the logs

Start the scheduler and trigger a dag.

$ airflow scheduler
$ airflow trigger_dag my-dag

Watch the output with tail -f.

$ tail -f ~/airflow/logs/my-dag.log