
I followed the instructions in the link below, and Airflow logs are now available on S3.

setting up s3 for logs in airflow
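
For reference, here is roughly what the remote logging settings from that guide look like in my airflow.cfg. This is a sketch: the bucket name and connection id are placeholders, and on Airflow 1.10+ you would also set remote_logging = True.

    [core]
    # Placeholder bucket and connection id -- adjust to your setup
    remote_base_log_folder = s3://my-airflow-logs/logs
    remote_log_conn_id = my_s3_conn
    encrypt_s3_logs = False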

However, I noticed that Airflow also continues to store the logs on the local machine.

Is there a way to disable local logging completely and store the logs only on S3? Did I miss any configuration to achieve this?

Thanks in advance for your help.

oved_s
  • Airflow stores your logs on your server while the task is running. This is what enables you to see your logs in 'real time' in the Airflow GUI. Your logs will only appear in S3 _after_ the task completes. I'm not aware of a way to skip storing them on the local server (but see the sketch after these comments). – RobinL Sep 22 '18 at 10:35
  • The logs in the UI are shown from S3. We rarely need live logging since the processes run during the night. The reason for changing the location is to save storage on the Airflow machine. – oved_s Sep 23 '18 at 08:07
  • @oved_s in my case, I can see that the DAG logs are written to S3, but when I open a log from the webserver, it is fetched from my worker machines and not from the S3 bucket (Airflow 1.9.0). Do you have an idea why? – Maurice Amar Nov 18 '19 at 21:39
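
Building on RobinL's explanation that the local file is the live copy and S3 is only written when the task completes, one workaround might be to delete the local file right after the upload. Below is an untested sketch against the Airflow 1.9/1.10 S3TaskHandler: the S3OnlyTaskHandler name is made up, the reliance on the file handler's baseFilename attribute is an assumption about the handler internals, and you would still need to wire the class in through a custom logging_config_class.

    import os

    from airflow.utils.log.s3_task_handler import S3TaskHandler

    class S3OnlyTaskHandler(S3TaskHandler):
        """Hypothetical handler: upload the task log to S3 as usual,
        then remove the local copy so only S3 retains the logs."""

        def close(self):
            # The parent handler writes the log to a local file; grab its
            # path before close() flushes the file and uploads it to S3.
            local_path = getattr(self.handler, 'baseFilename', None)
            super(S3OnlyTaskHandler, self).close()
            # Assumes the upload in close() succeeded; if it failed, this
            # would discard the only copy of the log.
            if local_path and os.path.exists(local_path):
                os.remove(local_path)

The 'real time' view RobinL mentions should keep working, since the local file is only removed after the task completes, but a failed upload would then leave you with no log at all.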

0 Answers