I would like to batch upload a JSON file to DynamoDB. At the moment I can successfully put items manually from a Python script (as below) and upload them to a table. How can I amend the script so that it reads an external JSON file (containing 200 items) and batch uploads all 200 items to the table?
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('exampletable')

# batch_writer() buffers put_item calls and sends them to
# DynamoDB in batches automatically.
with table.batch_writer() as batch:
    batch.put_item(
        Item={
            'ID': '2',
            'DateTime': '21/12/2017 13:16',
            'SourceDevice': '10',
            'DestinationDevice': '20',
            'DataType': 'full',
            'Activity': 'unusual'
        }
    )
    batch.put_item(
        Item={
            'ID': '3',
            'DateTime': '21/12/2017 13:40',
            'SourceDevice': '10',
            'DestinationDevice': '20',
            'DataType': 'full',
            'Activity': 'unusual'
        }
    )
The contents of the JSON file are as below:
[{
    "ID": "1",
    "DateTime": "21/12/2017 13:16",
    "SourceDevice": "10",
    "DestinationDevice": "20",
    "DataType": "part",
    "Activity": "normal"
}, {
    "ID": "1",
    "DateTime": "21/12/2017 13:16",
    "SourceDevice": "40",
    "DestinationDevice": "25",
    "DataType": "full",
    "Activity": "unusual"
}]
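
For reference, here is a minimal sketch of how the script could read the file and batch-write every item, assuming the JSON above is saved next to the script as items.json (a placeholder filename):

import json
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('exampletable')

# Load the list of items from the external JSON file.
# "items.json" is a placeholder; substitute the real path.
with open('items.json') as f:
    items = json.load(f)

# batch_writer() groups the puts into BatchWriteItem requests of up
# to 25 items each and resends any unprocessed items, so all 200
# items can be written in one simple loop.
with table.batch_writer() as batch:
    for item in items:
        batch.put_item(Item=item)

Because every attribute in this file is a string, each parsed dict can be passed straight to put_item; if the file contained numeric attributes, those values would need converting to Decimal first for the boto3 resource API.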