
I am trying to delete all items from an AWS DynamoDB table by hash key. I see a lot of discussion about it on the internet but no actual code samples, and all my attempts have failed.

What am I doing wrong here, or is there a better approach altogether?

List<Document> results = null;
var client = new AmazonDynamoDBClient("myamazonkey", "myamazonsecret");
var table = Table.LoadTable(client, "mytable");

var search = table.Query(new Primitive("hashkey"), new RangeFilter());
do {
  results = search.GetNextSet();
  search.Matches.Clear();
  foreach (var r in results)
  {
    client.DeleteItem(new DeleteItemRequest()
    {
      Key = new Key()
      {
        HashKeyElement = new AttributeValue() { S = "hashkey" },
        RangeKeyElement = new AttributeValue() { N = r["range-key"].AsString() }
      }
    });
  }
} while (results.Count > 0);
brendan
  • Does this answer your question? [Delete large data with same partition key from DynamoDB](https://stackoverflow.com/questions/49684100/delete-large-data-with-same-partition-key-from-dynamodb) – Liam Dec 09 '21 at 10:57

1 Answer


Problem solved using the AWS batch write functionality. I chop mine into batches of 25, but I think the API can take up to 100.

List<Document> results = null;
var client = new AmazonDynamoDBClient("myamazonkey", "myamazonsecret");
var table = Table.LoadTable(client, "mytable");
var batchWrite = table.CreateBatchWrite();
var batchCount = 0;

var search = table.Query(new Primitive("hashkey"), new RangeFilter());
do {
  results = search.GetNextSet();
  search.Matches.Clear();
  foreach (var document in results)
  {
    batchWrite.AddItemToDelete(document);
    batchCount++;
    if (batchCount % 25 == 0)
    {
      batchCount = 0;
      try
      {
        batchWrite.Execute();
      }
      catch (Exception exception)
      {
        Console.WriteLine("Encountered an Amazon Exception {0}", exception);
      }

      batchWrite = table.CreateBatchWrite();
    }
  }
} while (results.Count > 0);

// Flush whatever is left over that didn't fill a full batch of 25.
if (batchCount > 0) batchWrite.Execute();
brendan
  • Can you confirm that the API may be able to take 100 items? According to [these docs](http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchWriteItem.html) I believe it can only do 25 like your code sample. – David Sulpy Feb 10 '15 at 13:37
  • Yeah, it looks like get will allow 100 but write is limited to 25. – brendan Mar 24 '15 at 14:35
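
For anyone on a newer SDK: below is a rough sketch of the same delete-by-partition-key loop using the low-level Amazon.DynamoDBv2 client, honouring the 25-request BatchWriteItem limit discussed in the comments. The table and attribute names ("mytable", "hashkey", "range-key") are just the placeholders from the question, and the synchronous Query/BatchWriteItem calls assume the .NET Framework build of the SDK (the *Async variants would be used on .NET Core). Treat it as a sketch under those assumptions, not a drop-in replacement.

using System.Collections.Generic;
using System.Linq;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

class DeleteByHashKey
{
    static void DeleteAll(IAmazonDynamoDB client, string tableName, string hashKeyValue)
    {
        Dictionary<string, AttributeValue> lastKey = null;
        do
        {
            // Query one page of items for this partition key, fetching only the key attributes.
            var query = new QueryRequest
            {
                TableName = tableName,
                KeyConditionExpression = "#hk = :hk",
                ExpressionAttributeNames = new Dictionary<string, string>
                {
                    { "#hk", "hashkey" }, { "#rk", "range-key" }
                },
                ExpressionAttributeValues = new Dictionary<string, AttributeValue>
                {
                    { ":hk", new AttributeValue { S = hashKeyValue } }
                },
                ProjectionExpression = "#hk, #rk",
                ExclusiveStartKey = lastKey
            };
            var page = client.Query(query);
            lastKey = page.LastEvaluatedKey;

            // BatchWriteItem accepts at most 25 write requests per call, so chunk the page.
            foreach (var chunk in page.Items
                .Select((item, i) => new { item, i })
                .GroupBy(x => x.i / 25, x => x.item))
            {
                var writes = chunk.Select(item => new WriteRequest
                {
                    DeleteRequest = new DeleteRequest { Key = item }
                }).ToList();

                var response = client.BatchWriteItem(new BatchWriteItemRequest
                {
                    RequestItems = new Dictionary<string, List<WriteRequest>> { { tableName, writes } }
                });
                // Throttled deletes come back in response.UnprocessedItems; a production
                // version would retry those with backoff.
            }
        } while (lastKey != null && lastKey.Count > 0);
    }
}

Projecting only the key attributes keeps the query cheap, since a delete request needs nothing but the item's key.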