
When my Airflow DAG fails, I get the error logs at the path "/home/ec2-user/software/airflow/logs/dagtest_dag/trigger_load/2019-10-10T06:01:33.342433+00:00/1.log".

How can I ship these logs to an S3 bucket?

  • You can fire up a service to package your logs and post them to s3. – Kshitij Saxena Oct 11 '19 at 12:02
  • There's a module called `boto` in Python which can do this; check this out: https://stackoverflow.com/questions/15085864/how-to-upload-a-file-to-directory-in-s3-bucket-using-boto – Sawant Sharma Oct 11 '19 at 12:04

1 Answer


Configure a script like the one below as a cron job:

import boto3

s3 = boto3.client('s3')  # make sure your credentials / instance role allow S3 writes

# Upload a single log file; replace the local path, bucket name, and S3 key with your own
with open('path/to/your/logs.log', 'rb') as data:
    s3.upload_fileobj(data, 'bucketname', 'path/to/your/logs/in/s3.log')
– Kshitij Saxena
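
If you want to push the whole Airflow log tree rather than one file, a minimal sketch along these lines could run from the same cron job. The bucket name below is a placeholder and the log root is taken from the path in the question; adjust both to your setup.

import os
import boto3

LOG_ROOT = '/home/ec2-user/software/airflow/logs'  # Airflow log root, from the path in the question
BUCKET = 'your-bucket-name'                         # placeholder: replace with your bucket

s3 = boto3.client('s3')

# Walk the log tree and upload every file, reusing its relative path as the S3 object key
for dirpath, _, filenames in os.walk(LOG_ROOT):
    for name in filenames:
        local_path = os.path.join(dirpath, name)
        key = os.path.relpath(local_path, LOG_ROOT)
        s3.upload_file(local_path, BUCKET, key)

Scheduling that script with cron (for example an hourly entry) keeps new task logs flowing into the bucket without changing the DAG itself.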