I have a deployed DAG in which I'm using check_for_wildcard_key() to check whether files for a particular day are present in an S3 location, and then decide which branch to return. Following is the code:
from airflow.hooks import S3_hook

def checkforfiles(**kwargs):
    hook = S3_hook.S3Hook('s3_dummy')
    # sub_path_that_changes_daily is computed elsewhere in the DAG
    if hook.check_for_wildcard_key(f"s3://bucket/fixed/path/that/stays/the/same/daily/{sub_path_that_changes_daily}/*"):
        return 'branch1'
    else:
        return 'end'
The problem is that even though the files are present at their locations, 'end' is returned every time. I want to test what's happening on my local machine, since there is virtually no useful logging in Airflow. I have the access_key and secret_key for the bucket; how do I pass them to this S3Hook? As of now I get the following error:
AirflowNotFoundException: The conn_id `s3_dummy` isn't defined
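One approach I came across is defining the connection through an environment variable, though I'm not sure it's the idiomatic way. A minimal sketch with placeholder credentials, assuming Airflow resolves AIRFLOW_CONN_&lt;CONN_ID&gt; variables into connections (on newer Airflow versions the URI scheme may need to be aws:// instead of s3://):

import os
from urllib.parse import quote_plus

access_key = "AKIA..."                      # placeholder, use your own
secret_key = quote_plus("your_secret_key")  # URL-encode '/', '+', etc.

# Defines the `s3_dummy` connection without touching the metadata DB.
os.environ["AIRFLOW_CONN_S3_DUMMY"] = f"s3://{access_key}:{secret_key}@"

from airflow.hooks import S3_hook

hook = S3_hook.S3Hook('s3_dummy')
print(hook.check_for_wildcard_key("s3://bucket/some/prefix/*"))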
I tried using mock but couldn't figure out a way to get it to work. Any help would be appreciated.

Edit: I found the bug in my code without needing a mock connection, but let's keep this thread open to help others in need.
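For anyone who does want the mock route, here is a minimal sketch of what I was attempting, assuming checkforfiles can be imported from your DAG file (the module name my_dag_module is hypothetical). Patching check_for_wildcard_key on the class means the hook never looks up the s3_dummy connection, so no real credentials are needed:

from unittest import mock

from my_dag_module import checkforfiles  # hypothetical module name

# With the method patched to return True, the "files present" branch runs.
with mock.patch("airflow.hooks.S3_hook.S3Hook.check_for_wildcard_key",
                return_value=True):
    assert checkforfiles() == 'branch1'

# And with it patched to return False, the fallback branch runs.
with mock.patch("airflow.hooks.S3_hook.S3Hook.check_for_wildcard_key",
                return_value=False):
    assert checkforfiles() == 'end'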