
I am trying to send several hundred thousand JSON objects to a database via API POST calls. The problem is that when I try to send all those JSON objects to the database, with separate calls for each object, I get a lot of HTTP errors, mostly 400.

My question is, is there a better way to send all the JSON objects to the database? Could I send them all at once? Wait between API calls?

I do not know if it's important, but I'm trying to send the data to DynamoDB via an AWS API.

  • An error `400` can be caused by all sorts of issues because it's such a generic code (client error). I think more information is needed - what does your API look like? What object are you sending? What object (schema) does your DB expect? etc. – jonny Apr 29 '18 at 17:41

2 Answers


You can write items in batches using the BatchWriteItem operation, which accepts up to 25 put or delete requests per call.

You could also provision more capacity in DynamoDB. This question about throttling in DynamoDB could be useful. That might be the reason why you're getting HTTP 400 errors (more details about the HTTP error message you get would be useful).
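To make the batching idea concrete, here is a minimal sketch in Python: a helper that groups items into batches of 25 (the BatchWriteItem limit), with the actual boto3 call shown in comments because it needs AWS credentials and a real table. The table name and item shape are hypothetical.

```python
# Sketch: group items into batches of at most 25, DynamoDB's
# BatchWriteItem limit, before sending them to the table.

def chunk(items, size=25):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Example usage with boto3 (illustrative; needs credentials and a real table):
# import boto3
# table = boto3.resource("dynamodb").Table("my-table")  # hypothetical name
# with table.batch_writer() as writer:  # batches and retries for you
#     for item in all_items:
#         writer.put_item(Item=item)

if __name__ == "__main__":
    items = [{"id": str(n)} for n in range(60)]
    print(len(list(chunk(items))))  # 60 items -> 3 batches (25 + 25 + 10)
```

Note that boto3's `batch_writer` already handles splitting into 25-item batches and resending unprocessed items, so in practice you may not need the manual chunking at all.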

Guido

The documentation here states that an error 400 can be a ThrottlingException: "The request was denied due to request throttling."

If this is the case, you have three choices.

1) Limit the number of individual POST calls that are sent to the API per second from your code via a queuing system.

2) As recommended in a previous answer, you could use the batch functionality.

3) Increase the capacity of the table. This can be achieved via the console by selecting the table, going into the Capacity tab and increasing the write capacity units.
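For option 1, a simple way to cope with throttling is to retry each call with exponential backoff and jitter. Below is a minimal sketch; `send` and `ThrottledError` are hypothetical stand-ins for your API call and the ThrottlingException surfaced by the SDK.

```python
# Sketch: retry a single write with exponential backoff plus jitter
# whenever the call is throttled. All names here are illustrative.
import random
import time

class ThrottledError(Exception):
    """Stand-in for the SDK's ThrottlingException."""

def send_with_backoff(send, item, max_retries=5, base_delay=0.1):
    """Call send(item), sleeping exponentially longer after each throttle."""
    for attempt in range(max_retries):
        try:
            return send(item)
        except ThrottledError:
            # Full jitter: sleep a random fraction of the exponential cap.
            time.sleep(random.uniform(0, base_delay * (2 ** attempt)))
    raise ThrottledError("gave up after %d retries" % max_retries)
```

The jitter spreads retries from many concurrent writers apart in time, which avoids the retries themselves arriving in a synchronized burst and getting throttled again.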

TidyDev