
After running the tasks, logs are written to the GCS bucket, but it looks like they cannot be read back from the bucket, and the error below is shown.

*** Unable to read remote log from gs://*************-dev/example_bash_operator/run_after_loop/2021-07-19T06:59:32.870061+00:00/1.log ***

*** expected string or bytes-like object ***

*** Trying to get logs (last 100 lines) from worker pod examplebashoperatorrunafterloop-77043f1ceee34f4ab92bfa3c0cf5bcd ***

*** Unable to fetch logs from worker pod examplebashoperatorrunafterloop-77043f1ceee34f4ab92bfa3c0cf5bcd *** (404) Reason: Not Found

Thanks in advance.

  • Does this answer your question? [Google Cloud Composer (Apache Airflow) cannot access log files](https://stackoverflow.com/questions/61334954/google-cloud-composer-apache-airflow-cannot-access-log-files) – SANN3 Jul 20 '21 at 05:55
  • This might help you. https://stackoverflow.com/questions/49381634/airflow-remote-logging-connections-airflow-1-7-1-3 – GRS Jul 21 '21 at 21:22

2 Answers


Remote logging to Google Cloud Storage uses an existing Airflow connection to read and write logs; if that connection is not set up properly, reading the logs will fail. It is also recommended to update to the latest version to rule out known errors.

There is another workaround stated in another post that describes a similar scenario to yours. Please take a look at the accepted answer there and compare the configuration, paying attention to the connection name:

[core]
remote_log_conn_id = google_cloud_default

Also make sure that the credentials used by that connection have the correct permissions to access the GCS bucket.
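For reference, a complete remote-logging block in airflow.cfg might look like the sketch below; the bucket path is a placeholder, and depending on your Airflow version these options live under [core] (1.10) or [logging] (2.x):

[core]
remote_logging = True
remote_base_log_folder = gs://your-log-bucket/logs
remote_log_conn_id = google_cloud_default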

– Goli Nikitha

Make sure the service account key has all the required permissions, add the key (as JSON or a key file) to the connection ID below in the Airflow connections, and add the following line to the airflow.cfg file:

remote_log_conn_id = google_cloud_default

Then restart Airflow and re-run the DAGs; you should be able to find and read the logs in the GCS bucket.
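If it helps, one way to register the service account key with that connection ID is via the Airflow CLI (a rough sketch assuming an Airflow 2.x CLI; the key path, project ID, and scope below are placeholder values to replace with your own):

airflow connections add 'google_cloud_default' \
    --conn-type 'google_cloud_platform' \
    --conn-extra '{"extra__google_cloud_platform__key_path": "/path/to/keyfile.json", "extra__google_cloud_platform__project": "your-gcp-project", "extra__google_cloud_platform__scope": "https://www.googleapis.com/auth/cloud-platform"}'

You can also edit the google_cloud_default connection in the Airflow UI and paste the same JSON into its Extra field.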