I'm trying to write "hello world" into an Airflow log (Airflow 1.10.3). Based on the SO solutions presented here and here, I should be able to just import logging and call logging.info('hello world'). That doesn't seem to work for me.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'benten',
    'depends_on_past': False,
    'start_date': datetime(2019, 7, 25),
    'email_on_failure': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

def logging_is_fun():
    logging.debug("hello world")
    logging.info("hello world")
    logging.critical("hello world")
    return None

with DAG('fun_logs', schedule_interval='45 * * * *', default_args=default_args) as dag:
    log_task = PythonOperator(python_callable=logging_is_fun, task_id='log_test_task')
I trigger the DAG manually and the task executes with no problems. But alas, when I check the logs, all I see is this:
*** Reading local file: /home/ubuntu/airflow/logs/fun_logs/log_test_task/2019-08-31T19:22:49.653712+00:00/1.log
Where are my amazing "hello world" statements? Given my log level setting, I don't expect to see all of them, but I do expect to see the critical message.
My airflow.cfg has the following in it (all default settings to the best of my knowledge):
# The folder where airflow should store its log files
# This path must be absolute
base_log_folder = /home/ubuntu/airflow/logs
# Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
# Users must supply an Airflow connection id that provides access to the storage
# location. If remote_logging is set to true, see UPDATING.md for additional
# configuration requirements.
remote_logging = False
remote_log_conn_id =
remote_base_log_folder =
encrypt_s3_logs = False
# Logging level
logging_level = WARN
fab_logging_level = WARN
# Logging class
# Specify the class that will specify the logging configuration
# This class has to be on the python classpath
# logging_config_class = my.path.default_local_settings.LOGGING_CONFIG
logging_config_class =
# Log format
log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
# Log filename format
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
dag_processor_manager_log_location = /home/ubuntu/airflow/logs/dag_processor_manager/dag_processor_manager.log
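In case it helps diagnose this, one thing I could do inside the callable (a hypothetical diagnostic sketch, not part of my DAG above) is report what the task process actually has configured on the root logger, since Airflow sets up its own handlers per task:

```python
import logging

def inspect_logging():
    """Return the root logger's effective level name and its handler types,
    to see what the Airflow task process actually configured."""
    root = logging.getLogger()
    level_name = logging.getLevelName(root.getEffectiveLevel())
    handler_types = [type(h).__name__ for h in root.handlers]
    return level_name, handler_types
```

Calling this from a PythonOperator and printing the result would show whether the root logger in the task process is really at WARN and which handlers are attached.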