I am trying to upload a text file to an S3 bucket from Python, using the s3fs library to connect to AWS and perform the upload, but I get the error below when I try to write the content.
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/s3fs/core.py", line 112, in _error_wrapper
return await func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/aiobotocore/client.py", line 358, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (MalformedXML) when calling the PutObject operation: The XML you provided was not well-formed or did not validate against our published schema
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/ubuntu/airflow/twitter_dag/twitter_etl.py", line 46, in <module>
run_twitter_etl()
self.commit()
File "/usr/local/lib/python3.10/dist-packages/s3fs/core.py", line 2188, in commit
write_result = self._call_s3(
File "/usr/local/lib/python3.10/dist-packages/s3fs/core.py", line 2040, in _call_s3
return self.fs.call_s3(method, self.s3_additional_kwargs, *kwarglist, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/fsspec/asyn.py", line 113, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/fsspec/asyn.py", line 98, in sync
raise return_result
File "/usr/local/lib/python3.10/dist-packages/fsspec/asyn.py", line 53, in _runner
result[0] = await coro
File "/usr/local/lib/python3.10/dist-packages/s3fs/core.py", line 339, in _call_s3
return await _error_wrapper(
File "/usr/local/lib/python3.10/dist-packages/s3fs/core.py", line 139, in _error_wrapper
raise err
OSError: [Errno 22] The XML you provided was not well-formed or did not validate against our published schema
Code:
import s3fs

s3 = s3fs.S3FileSystem(anon=False)
with s3.open("<bucket-name>/<filename>.txt", 'w') as f:
    f.write('Hello')
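For completeness, here is the same write done in bytes mode instead of text mode (a minimal sketch; the bucket name and key are placeholders, not my real values):

import s3fs

s3 = s3fs.S3FileSystem(anon=False)
# Same upload, but opening the object in binary mode and writing bytes
with s3.open("<bucket-name>/<filename>.txt", 'wb') as f:
    f.write(b'Hello')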
The code is running on an EC2 instance with an IAM role attached for access. I verified the permissions by running the aws s3 ls command on the instance, which returned my list of buckets and let me access the bucket in question.
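For reference, a minimal sketch of the same connectivity check done from Python via s3fs (the bucket name is a placeholder, and this assumes the same instance-role credentials):

import s3fs

s3 = s3fs.S3FileSystem(anon=False)
# Equivalent of `aws s3 ls <bucket-name>`: list the keys at the top level of the bucket
print(s3.ls("<bucket-name>"))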