
Has anyone here sent Airflow logs to S3?

I am running Airflow version 1.10.9 and I am trying to send my logs to S3. I followed the official documentation's instructions here, but they don't seem to be working at all.
I also tried restarting the Airflow scheduler & webserver.

If someone has done this, can they tell me how they did it?

The connection I created for S3:

Conn Id: S3Connection
Conn Type: S3
Schema: {"aws_access_key_id":"some-key", "aws_secret_access_key": "some-key"}

> All other fields were left blank.

My airflow.cfg has this:

remote_logging = True
remote_log_conn_id = S3Connection
remote_base_log_folder = s3://log-bucket/qa
encrypt_s3_logs = False

> Note that only log-bucket exists right now; I assume the qa key/prefix will be created automatically when something is uploaded there.
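
In case it matters, these lines sit under the [core] section of airflow.cfg (1.10.x has no separate logging section):

[core]
remote_logging = True
remote_log_conn_id = S3Connection
remote_base_log_folder = s3://log-bucket/qa
encrypt_s3_logs = False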

I tried this answer & this answer as well, but that did not work for me either.
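
As a sanity check, here is a rough sketch (run from a Python shell inside the Airflow environment; it assumes the S3Hook that ships with 1.10.x and the connection/bucket names above) that I can use to confirm my credentials and bucket are reachable at all:

# Sketch: verify the S3 connection independently of the logging machinery.
# Assumes Airflow 1.10.x and the S3Connection / log-bucket names used above.
from airflow.hooks.S3_hook import S3Hook

hook = S3Hook(aws_conn_id="S3Connection")
print(hook.check_for_bucket("log-bucket"))  # True if the credentials can see the bucket
hook.load_string("test", key="qa/connection_test.txt", bucket_name="log-bucket", replace=True)  # writes a small test object under the log prefix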


1 Answer


One potential issue is that the JSON connection details you entered into Schema actually belong in the Extra field of the connection form.

Try updating your connection accordingly.
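
For example, keeping everything else the same and just moving the JSON out of Schema (which can be left blank):

Conn Id: S3Connection
Conn Type: S3
Extra: {"aws_access_key_id":"some-key", "aws_secret_access_key": "some-key"}

Equivalently, as a sketch using the 1.10.x CLI (remove or edit the existing connection first so you don't end up with duplicates):

airflow connections --delete --conn_id S3Connection
airflow connections --add --conn_id S3Connection --conn_type S3 --conn_extra '{"aws_access_key_id": "some-key", "aws_secret_access_key": "some-key"}'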

  • Hi, I am able to send logs to S3 now, but a copy is also being stored on the local disk. How do I stop that? – saadi Sep 22 '20 at 07:13