I have multiple large CSV files in an S3 bucket and I want to write their data to a DynamoDB table. The issue is that my Lambda function runs for more than 15 minutes and hits the timeout without completely writing the CSV file to DynamoDB. So is there a way to split the CSV into smaller parts?
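For context, the core of my current function is roughly like this (a simplified sketch; the bucket, key, and table names are placeholders):

```python
import codecs
import csv

import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("my-table")  # placeholder table name


def lambda_handler(event, context):
    # Stream the CSV from S3 and write every row to DynamoDB.
    # With a large file this loop does not finish within the
    # 15-minute Lambda limit, so the invocation times out part-way.
    obj = s3.get_object(Bucket="my-bucket", Key="data/big-file.csv")  # placeholder bucket/key
    rows = csv.DictReader(codecs.getreader("utf-8")(obj["Body"]))

    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item=row)  # assumes the CSV columns include the table's key attributes
```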
Things I've tried so far:
this - This doesn't invoke itself the way it's supposed to (it writes a few lines to the table and then stops without any errors). A sketch of the self-invoking idea I mean is after this list.
aws document - Gives an "s3fs module not found" error. I tried many things to make it work but couldn't.
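To be clearer about the first attempt: the idea from that post, as far as I understand it, is to write only a fixed number of rows per invocation and then have the function invoke itself asynchronously with the next starting row. Roughly this (my own sketch with placeholder names and batch size, not the exact code from the link):

```python
import codecs
import csv
import json

import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")
table = boto3.resource("dynamodb").Table("my-table")  # placeholder table name

ROWS_PER_INVOCATION = 10_000  # placeholder batch size


def lambda_handler(event, context):
    start = event.get("start_row", 0)

    obj = s3.get_object(Bucket="my-bucket", Key="data/big-file.csv")  # placeholder bucket/key
    rows = csv.DictReader(codecs.getreader("utf-8")(obj["Body"]))

    written = 0
    with table.batch_writer() as batch:
        for i, row in enumerate(rows):
            if i < start:
                continue  # skip rows already handled by earlier invocations
            if written >= ROWS_PER_INVOCATION:
                # More rows remain: re-invoke this same function asynchronously
                # with the next starting position, then stop this invocation.
                lambda_client.invoke(
                    FunctionName=context.function_name,
                    InvocationType="Event",
                    Payload=json.dumps({"start_row": start + written}),
                )
                return
            batch.put_item(Item=row)
            written += 1
```

With my version of this, the chained invocation never seems to happen, which is the "doesn't invoke itself" problem mentioned above.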
Is there any way I can get this done?
Thank you.