
I am trying to upload a JSON file to a Wasabi S3 bucket but can't get it to work. The goal is to upload the JSON file to the S3 bucket when the spider finishes. I have created the policies with WasabiFullAccess and AmazonS3FullAccess, and in the Scrapy settings.py I have:

FEEDS = (
    {
        "https://s3.ap-southeast-2.wasabisys.com/bucketname/%(name)s/%(name)s_%(time)s.json": {
            "format": "json",
        }
    },
)

aws_access_key_id = "MY_ACCESS_KEY"
aws_secret_access_key = "MY_SECRET_KEY"

But this is not working; nothing makes it to the bucket. I have installed boto as well. Is there something I am missing here? I have searched almost every search engine but couldn't find any solution or any working example of a Wasabi S3 Scrapy pipeline. Any help or hints would be much appreciated. Thanks!
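For reference, here is a minimal sketch of how a feed export to an S3-compatible endpoint is typically wired up in settings.py, assuming Scrapy 2.6+ with botocore installed; the endpoint URL is taken from the question and the bucket name is a placeholder, so this is untested against Wasabi:

# Sketch of settings.py, assuming Scrapy 2.6+ with botocore installed.
# FEEDS is a dict keyed by the feed URI (not a tuple), and the feed is
# written to the bucket via the s3:// scheme.
FEEDS = {
    "s3://bucketname/%(name)s/%(name)s_%(time)s.json": {
        "format": "json",
    },
}

# Credentials are read from these uppercase Scrapy settings.
AWS_ACCESS_KEY_ID = "MY_ACCESS_KEY"
AWS_SECRET_ACCESS_KEY = "MY_SECRET_KEY"

# Point the S3 feed storage at Wasabi instead of AWS (endpoint taken
# from the question; available as a setting in Scrapy 2.6+).
AWS_ENDPOINT_URL = "https://s3.ap-southeast-2.wasabisys.com"

The main differences from the config above are that FEEDS is a dict, the credential settings are uppercase, and the endpoint is supplied separately via AWS_ENDPOINT_URL rather than embedded in the feed URI.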

X-somtheing
  • Have you set up the correct Scrapy pipeline to handle all the items yielded by the spider? – E Joseph Oct 30 '22 at 20:53
  • I didn't create a custom pipeline when I asked this question, but I have created one now and am running into a different issue, so I asked a new question. If possible, could you please take a look at it? Maybe you can figure it out: https://stackoverflow.com/questions/74257188/botocore-exceptions-clienterror-an-error-occurred-invalidaccesskeyid-when-cal – X-somtheing Oct 30 '22 at 22:30

0 Answers