
I have a use case where I write data to DynamoDB in two tables, say t1 and t2, in a transaction. My app needs to read data from these tables many times (1 write, at least 4 reads). I am considering DAX vs. ElastiCache. Does anyone have any suggestions? Thanks in advance, K

karuna

5 Answers


ElastiCache is not designed specifically for use with DynamoDB; DAX is.

DAX is good for read-heavy apps like yours. But be aware that DAX only supports eventually consistent reads, so don't use it for banking apps or anything else where the data always needs to be perfectly up to date. Without further info it's hard to say more; these are just two general points to consider.

> Amazon DynamoDB Accelerator (DAX) is a fully managed, highly available, in-memory cache that can reduce Amazon DynamoDB response times from milliseconds to microseconds, even at millions of requests per second. While DynamoDB offers consistent single-digit millisecond latency, DynamoDB with DAX takes performance to the next level with response times in microseconds for millions of requests per second for read-heavy workloads. With DAX, your applications remain fast and responsive, even when a popular event or news story drives unprecedented request volumes your way. No tuning required.

Source: https://aws.amazon.com/dynamodb/dax/

Boommeister

AWS recommends that you use **DAX** as the solution for this requirement. ElastiCache is an older approach, and it is used to store session state in addition to cache data.

DAX is designed for read-intensive, latency-sensitive applications that can tolerate eventually consistent reads. DAX maintains two separate caches:

  1. Item cache - populated with items based on GetItem results.
  2. Query cache - keyed by the parameters used in Query or Scan requests.
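The two caches are keyed differently, which is why they operate independently. A minimal sketch of the idea (a hypothetical key scheme for illustration, not DAX's actual internals):

```python
import hashlib
import json

def item_cache_key(table, key):
    # Item cache: keyed by table name + the item's primary key.
    return ("item", table, json.dumps(key, sort_keys=True))

def query_cache_key(table, query_params):
    # Query cache: keyed by a hash of the *entire* request --
    # KeyConditionExpression, FilterExpression, Limit, and so on.
    blob = json.dumps({"table": table, **query_params}, sort_keys=True)
    return ("query", hashlib.sha256(blob.encode()).hexdigest())

k1 = item_cache_key("t1", {"pk": "user#1"})
k2 = query_cache_key("t1", {"KeyConditionExpression": "pk = :p", "Limit": 10})
print(k1[0], k2[0])  # different namespaces: 'item' vs 'query'
```

Because a write only knows the item's primary key, it can refresh the item-cache entry but has no way to find every query-cache entry whose result set that item appears in.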

Cheers!

Phani Teja

I'd recommend using DAX with DynamoDB, provided most of your reads use item-level APIs such as GetItem, and NOT query-level APIs.

Why? DAX has one surprising behavior. From the AWS docs:

> "Every write to DAX alters the state of the item cache. However, writes to the item cache don't affect the query cache. (The DAX item cache and query cache serve different purposes, and operate independently from one another.)"

To elaborate: if a query result is cached, and a later write changes data covered by that query, then until the cached entry expires, the query cache will keep returning the outdated result.

This out-of-sync issue is also discussed here.
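The scenario above can be simulated with a toy write-through cache (assumed names for illustration, not the real DAX client API): the write refreshes the item cache, but a previously cached query result stays stale until it expires:

```python
class ToyDax:
    def __init__(self, backing):
        self.backing = backing          # stands in for the DynamoDB table
        self.item_cache = {}
        self.query_cache = {}

    def put_item(self, key, item):
        self.backing[key] = item        # write-through to the table
        self.item_cache[key] = item     # item cache is refreshed...
        # ...but the query cache is deliberately left untouched.

    def get_item(self, key):
        # Serve from the item cache, loading from the table on a miss.
        return self.item_cache.setdefault(key, self.backing[key])

    def query_all(self):
        # Cache the full result set under the query's parameters.
        if "all" not in self.query_cache:
            self.query_cache["all"] = list(self.backing.values())
        return self.query_cache["all"]

table = {"u1": {"name": "old"}}
dax = ToyDax(table)
dax.query_all()                         # query result is now cached
dax.put_item("u1", {"name": "new"})
print(dax.get_item("u1"))               # fresh: {'name': 'new'}
print(dax.query_all())                  # stale: [{'name': 'old'}]
```

GetItem sees the new value immediately, while the cached Query keeps serving the old result set.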

Tejaskumar

I find DAX useful only for cached queries, PutItem, and GetItem. In general it's very difficult to find a use case for it.

DAX separates Query/Scan results from CRUD operations on individual items. That means if you update an item and then run a Query/Scan, the result will not reflect the change.

You can't explicitly invalidate the cache; entries are only invalidated when their TTL is reached or when node memory is full and old items are evicted.

Take-aways:

  1. Doing puts/updates and then queries: two separate caches, so they go out of sync.
  2. Looking up a single item: you are limited to the primary key, the default index, and a GetItem request (no Query with Limit 1). You can't use any secondary indexes for gets/updates/deletes.
  3. The ConsistentRead option on Query does return the latest data, but only for the primary index.
  4. Writing through DAX is slower than writing directly to DynamoDB, since there is an extra hop in the middle.
  5. X-Ray does not work with DAX.
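The TTL-only eviction mentioned above can be sketched with a generic TTL cache (an illustration, not DAX's actual implementation); there is no invalidate() call, and an entry is only refreshed once its TTL has elapsed. The injected clock makes the expiry deterministic:

```python
import time

class TtlCache:
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self.store = {}                 # key -> (value, cached_at)

    def get(self, key, loader):
        now = self.clock()
        hit = self.store.get(key)
        if hit is not None and now - hit[1] < self.ttl:
            return hit[0]               # still fresh: serve from cache
        value = loader(key)             # expired or missing: reload
        self.store[key] = (value, now)
        return value

# Fake clock so the example is deterministic.
t = [0.0]
cache = TtlCache(ttl_seconds=5, clock=lambda: t[0])
print(cache.get("k", lambda k: "v1"))   # loads and caches 'v1'
print(cache.get("k", lambda k: "v2"))   # still 'v1': TTL not reached
t[0] = 6.0
print(cache.get("k", lambda k: "v2"))   # TTL expired: reloads 'v2'
```

Until the TTL elapses, every reader gets the cached value, no matter how stale it is.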

Use case:

  1. You have queries where you don't care that results may be out of date.
  2. You are doing a few PutItem/UpdateItem calls and a lot of GetItem calls.
Townsheriff
  • 569
  • 6
  • 14
  • DAX is write-through. So if you perform an update through DAX, it'll get updated both on the DAX cache and on the DDB. In that sense there is no reason to invalidate cache (unless a conflict with last writer wins wasn't resolved properly). It is however eventually consistent. – Kevin Van Ryckegem Jul 25 '22 at 09:32
  • 1
    While this is true for putItem and getItem, it is not the case for the query cache - updating/creating/deleting an item will not update cached dynamodb query and vice versa. See the docs - https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DAX.consistency.html#DAX.consistency.query-cache – Townsheriff Jul 25 '22 at 14:56
  • I appreciate this alternate take among the rest of the answers – rpivovar Sep 25 '22 at 13:51

DAX provides eventually consistent read access with single-digit-millisecond performance at any scale, which means it can serve millions of requests per second. Read-heavy applications that need the fastest, near-real-time response times and don't require strong consistency, such as gaming applications, should use DAX.

DAX is not a good solution when you have a write-heavy application, or an application that requires strongly consistent reads, such as banking or other transaction-based applications. It suits applications where latency is a bigger concern than freshness, though the data read is still mostly correct.

ElastiCache (Redis), on the other hand, is best when you need a general-purpose cache, since you can use it with RDS, Aurora, self-hosted databases on EC2, MongoDB, etc. ElastiCache (Memcached) is ideal for simple applications running on a few servers.

Since yours is a read-heavy application on DynamoDB, unless you need strongly consistent reads, DAX is the best choice.

Vidz Here