
I have a list of files (keys) in S3 which I am reading with the code below. I would like to write the output for each individual key to its own JSON file.

I am not finding an ideal way to do this. Could you recommend how I can achieve it?

import json
import boto3
from io import BytesIO
import gzip

try:
    # Create a boto3 session (configure credentials/region as needed)
    session = boto3.session.Session()
    s3 = session.resource('s3')
    key = '00604T143000Z_20200604T143500Z.log.gz'
    obj = s3.Object('my_bucket', key)
    # Download the gzipped object and decompress it in memory
    n = obj.get()['Body'].read()
    gzipfile = gzip.GzipFile(fileobj=BytesIO(n))
    content = gzipfile.read()
    print(content)
except Exception as e:
    print(e)
    raise

1 Answer


You can write the data out to a JSON file with json.dump:

import json

# Data is whatever JSON-serializable object you built from the S3 content
with open("file.json", "w") as file:
    json.dump(Data, file)

Data must be a JSON-serializable object, typically a dictionary or a list.

For more on the json module, see:

https://docs.python.org/3/library/json.html

For more on dictionaries, see:

https://www.programiz.com/python-programming/dictionary#:~:text=Python%20dictionary%20is%20an%20unordered,when%20the%20key%20is%20known.
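
To get one JSON file per S3 key, you can combine this with the read loop from your question. Below is a minimal sketch; the bucket name and key list are placeholders taken from your snippet, and it assumes each non-empty line of the decompressed log is itself a JSON object, so adjust the parsing to whatever your log format actually is.

import json
import gzip
from io import BytesIO

import boto3

s3 = boto3.resource('s3')
bucket = 'my_bucket'  # placeholder bucket name from the question
keys = ['00604T143000Z_20200604T143500Z.log.gz']  # replace with your list of keys

for key in keys:
    # Download and decompress the gzipped log in memory
    raw = s3.Object(bucket, key).get()['Body'].read()
    text = gzip.GzipFile(fileobj=BytesIO(raw)).read().decode('utf-8')

    # Assumption: each non-empty line is a JSON object; if your logs are
    # plain text, store the lines as a list of strings instead
    records = [json.loads(line) for line in text.splitlines() if line.strip()]

    # Write one JSON file per key, named after the key
    out_name = key.rsplit('.log.gz', 1)[0] + '.json'
    with open(out_name, 'w') as out_file:
        json.dump(records, out_file, indent=2)

Each .log.gz key then produces a .json file of the same name in the working directory.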
