So I have some historical data on S3 in .csv/.parquet format. Every day I have a batch job running that gives me two files: the list of records that need to be deleted from the historical snapshot, and the new records that need to be inserted into it. I cannot run insert/delete queries on Athena. What options (cost-effective and managed by AWS) do I have to solve this problem?
-
Please let me know if/how your situation is different to the linked answer and we can reopen the question. – John Rotenstein Aug 11 '20 at 00:06
-
The above question is about running delete queries on Athena; my question is about what options I have to update S3 data in AWS. It could be Athena, DynamoDB, etc. I don't think these two are duplicate questions. – Sanu Bhattacharya Aug 11 '20 at 05:12
-
Similar to: [amazon web services - Can I delete data (rows in tables) from Athena? - Stack Overflow](https://stackoverflow.com/questions/48815504/can-i-delete-data-rows-in-tables-from-athena) – John Rotenstein Aug 11 '20 at 05:46
1 Answer
Objects in Amazon S3 are immutable. This means that they can be replaced, but they cannot be edited.
Amazon Athena, Amazon Redshift Spectrum and Hive/Hadoop can query data stored in Amazon S3. They typically look in a supplied path and load all files under that path, including sub-directories.
To add data to such data stores, simply upload an additional object to the given path.
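For example, a minimal sketch of appending the daily "insert" records by uploading an extra Parquet file under the table's prefix, using pandas (with pyarrow and s3fs installed). The bucket name, prefixes, and file names are placeholders for your own layout:

```python
import uuid
import pandas as pd

# Hypothetical location of the snapshot table's data files on S3
TABLE_PATH = "s3://my-bucket/snapshot"

# New records produced by the daily batch job (placeholder key)
new_rows = pd.read_parquet("s3://my-bucket/batch/inserts.parquet")

# Writing an additional object under the table's prefix is enough for
# Athena / Redshift Spectrum / Hive to pick up the rows on the next query.
new_rows.to_parquet(f"{TABLE_PATH}/part-{uuid.uuid4().hex}.parquet", index=False)
```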
To delete all data in one object, delete the object.
However, if you wish to delete data within an object, then you will need to replace the object with a new object that has those rows removed. This must be done outside of S3. Amazon S3 cannot edit the contents of an object.
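As an illustration of that replace step, here is a rough sketch with pandas. The bucket, object key, and the `id` join column are assumptions about your data layout, and the "delete" file is assumed to list the keys of the rows to remove:

```python
import pandas as pd

BUCKET = "my-bucket"                         # placeholder bucket
OBJECT_KEY = "snapshot/part-00000.parquet"   # object containing rows to remove

# Load the existing object and the batch job's "delete" list
current = pd.read_parquet(f"s3://{BUCKET}/{OBJECT_KEY}")
to_delete = pd.read_parquet(f"s3://{BUCKET}/batch/deletes.parquet")

# Drop the matching rows (assumes both files share an 'id' key column)
remaining = current[~current["id"].isin(to_delete["id"])]

# S3 cannot edit in place, so write a complete new object over the old key
remaining.to_parquet(f"s3://{BUCKET}/{OBJECT_KEY}", index=False)
```

This could run in a Glue job or Lambda on a schedule, which keeps the workflow managed by AWS.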
Databricks has a product called Delta Lake that can add an additional layer between query tools and Amazon S3:
> Delta Lake is an open source storage layer that brings reliability to data lakes. Delta Lake provides ACID transactions, scalable metadata handling, and unifies streaming and batch data processing. Delta Lake runs on top of your existing data lake and is fully compatible with Apache Spark APIs.
Delta Lake supports deleting data from a table because it sits "in front of" Amazon S3.
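For example, a rough sketch of the daily delete-then-append flow using the Delta Lake API on Spark. The table path, the `id` key column, the batch file locations, and the SparkSession configuration are assumptions; check the Delta Lake docs for the exact setup in your environment:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

# Historical snapshot stored as a Delta table on S3 (placeholder path)
snapshot = DeltaTable.forPath(spark, "s3://my-bucket/delta/snapshot")

deletes = spark.read.parquet("s3://my-bucket/batch/deletes.parquet")
inserts = spark.read.parquet("s3://my-bucket/batch/inserts.parquet")

# Remove the rows listed in the daily delete file (assumes an 'id' key column)
(snapshot.alias("t")
 .merge(deletes.alias("d"), "t.id = d.id")
 .whenMatchedDelete()
 .execute())

# Append the new records
inserts.write.format("delta").mode("append").save("s3://my-bucket/delta/snapshot")
```

A similar transactional layer is available from Apache Hudi and Apache Iceberg, both of which can run on AWS-managed Spark (EMR or Glue).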
