
I have an Application that requires data from Service2. Service2 will return the same answer for a given request forever, unless its backing database is updated. The database is updated very rarely, let's say twice per year.

I would like the Application to cache the answers from Service2, but also to expose a way to invalidate that cache externally. I thought of exposing a RESTful web service from the Application, but I am confused about how to design it correctly.

/application/cache/invalidate is not a RESTful URL, so I was thinking of /application/cache/ called with HTTP POST. However, it seems to me that in a proper RESTful design, when POST is used to update a resource, the content to update should be contained in the body of the request.

What is the right way to design an "InvalidateCache" RESTful web service?

Eric
Edmondo
  • see also https://stackoverflow.com/questions/53324538/rest-low-latency-how-should-i-reply-to-a-get-while-an-upload-is-pending – Bruce Adams Dec 04 '18 at 10:12

2 Answers


Consider using DELETE instead of POST, with the URL:

/application/cache/ 

In REST, both PUT and DELETE are considered idempotent actions. That is, they can be repeated multiple times with the same resulting resource state. In this case, your cache is the resource, and multiple DELETEs will result in the same state: a cleared cache.

You could consider adding a descriptor to your URL to clarify that you are clearing the contents of your cache and not deleting the cache object itself. Something like

/application/cache/contents

perhaps, but that is up to you. Going that route could also potentially let you selectively delete from your cache if necessary.
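
To make this concrete, here is a minimal sketch of what such an endpoint could look like, assuming a JAX-RS service and a plain in-memory map standing in for the Application's cache (the class name, path, and cache are placeholders, not something from the question):

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    import javax.ws.rs.DELETE;
    import javax.ws.rs.Path;
    import javax.ws.rs.core.Response;

    // Hypothetical resource: the cache contents, exposed so they can be cleared externally.
    @Path("/application/cache/contents")
    public class CacheContentsResource {

        // Placeholder for whatever cache implementation the Application really uses.
        private static final Map<String, Object> CACHE = new ConcurrentHashMap<>();

        @DELETE
        public Response clearContents() {
            CACHE.clear();                        // repeating this leaves the same state: an empty cache
            return Response.noContent().build();  // 204 No Content, nothing useful to return
        }
    }

Issuing DELETE /application/cache/contents any number of times leaves the cache in the same cleared state, which is exactly the idempotency property described above.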

Adam S
  • Excellent! Is it REST-compliant to have the cache automatically regenerate itself after a DELETE has been issued? – Edmondo Nov 09 '12 at 10:01
  • Yes, there is nothing to stop another actor from modifying the cache. Looking at it another way, say you exposed a PUT of values into the cache and a PUT happened immediately following a DELETE. After that sequence, the cache would not be empty, but the result of each individual REST action would be valid. – Adam S Nov 09 '12 at 15:36
  • What I've always wondered is how to properly support an Admin portal that wants data in real time while also supporting the customer facing sites that should be getting cached data. – The Muffin Man Dec 04 '16 at 18:53

This might not answer your question directly but you may also want to look into HTTP ETags, which are well suited for caching in RESTful designs.

The idea would be that Application would GET a resource from Service2, which would return the resource along with an ETag header (it could be a last-modified timestamp or a hash). Application would then cache that resource along with the ETag.

When Application needs to work with the resource again, it can send an HTTP GET to Service2 with the cached ETag in an If-None-Match header.

  • If Service2 finds that the resource has not changed since it was returned to Application the first time, it returns an empty response with HTTP status 304 (Not Modified), indicating that Application can use the data in its cache.
  • If the data has been updated, Service2 returns the new resource with a new ETag header (and HTTP status 200).

This approach works well if you don't mind the extra HTTP GET to see if the resource changed, and if Service2 can easily determine whether the resource has changed (without having to load it).

The advantage is that Service2 doesn't have to invalidate the caches of its clients (Application), which might not be a very good practice (and could be hard to do if you have a lot of clients).
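
As a rough illustration of that conditional GET from the Application side, here is a sketch using Java's built-in HttpClient; the Service2 URL, the cached ETag, and the cached body are placeholder values:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class Service2Client {

        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            // Values remembered from the first GET (placeholders).
            String cachedEtag = "\"abc123\"";
            String cachedBody = "cached answer";

            // Hypothetical Service2 resource URL.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://service2.example.com/answers/42"))
                    .header("If-None-Match", cachedEtag)
                    .GET()
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            if (response.statusCode() == 304) {
                // Not modified: the cached copy is still valid.
                System.out.println("Using cache: " + cachedBody);
            } else {
                // Modified: use the fresh body and remember the new ETag for next time.
                cachedBody = response.body();
                cachedEtag = response.headers().firstValue("ETag").orElse(cachedEtag);
                System.out.println("Refreshed: " + cachedBody);
            }
        }
    }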

Stephan
Christophe L