
I'm facing the following dilemma while implementing the LRU algorithm. For every entry I have to store (key, value, timestamp), so as you guessed the cache structure can be:

 cache[key] = [ value, tstamp ]

So far so good. The problem is that the LRU algorithm requires that, when I have to purge an element (because the cache is full), I do a linear search over the timestamps to find the least recently used item, which is slow.
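For illustration, this is the linear-scan eviction I mean, as a minimal sketch (`evict_lru` is just an illustrative name):

    # O(n) eviction: scan every entry's timestamp to find the
    # least recently used key -- this is the slow part.
    def evict_lru(cache):
        oldest_key = min(cache, key=lambda k: cache[k][1])  # cache[k] == [value, tstamp]
        del cache[oldest_key]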

On the other hand, I can't build a reverse hash tstamp => key, because the key is 1 KByte and the additional memory usage would negate the benefit of the cache.

Any ideas on how to preserve the speed without increasing the memory usage?

If I could have something like a reference, I could probably point to the cache key without allocating additional memory!
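As far as I understand, Python containers already store references to objects rather than copies, so keeping the same key object in a second structure should only cost a pointer; a quick check:

    key = "x" * 1024                    # a 1 KB key object
    cache = {key: ["value", 1234567]}
    order = [key]                       # e.g. a node in a recency list

    # both containers reference the *same* object; no 1 KB copy is made
    print(order[0] is list(cache)[0])   # True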


I can't use an LRU library, because the key is unhashable by default; I have to transform it beforehand. I'm also using Python 2.7.


1 Answer


If you want an out-of-the-box LRU cache, refer to functools.lru_cache (added in Python 3.2).
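For completeness, a minimal usage sketch; note that lru_cache requires hashable arguments, and `lookup` / `expensive_computation` are hypothetical names:

    from functools import lru_cache  # Python 3.2+

    @lru_cache(maxsize=128)          # keeps the 128 most recently used results
    def lookup(key):
        return expensive_computation(key)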

If you need to implement it yourself, use a linked list to keep track of the access history; please refer to my other answer regarding how functools.lru_cache works internally.
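On Python 2.7, one way to get that linked-list behaviour without writing it by hand is collections.OrderedDict, which keeps its keys in a doubly linked list internally. A minimal sketch (the class name and maxsize default are my own choices):

    from collections import OrderedDict

    class LRUCache(object):
        """LRU cache on top of OrderedDict (works on Python 2.7).

        OrderedDict keeps its keys in a doubly linked list, so marking
        an entry as most recently used and evicting the oldest one are
        both O(1) -- no timestamp scan. Only a reference to the key is
        stored, so the 1 KB key object is not duplicated.
        """

        def __init__(self, maxsize=1024):
            self.maxsize = maxsize
            self._data = OrderedDict()

        def __getitem__(self, key):
            # pop + re-insert moves the entry to the "most recent" end
            # (Python 2.7's OrderedDict has no move_to_end)
            value = self._data.pop(key)
            self._data[key] = value
            return value

        def __setitem__(self, key, value):
            if key in self._data:
                self._data.pop(key)
            elif len(self._data) >= self.maxsize:
                # popitem(last=False) removes the least recently used entry
                self._data.popitem(last=False)
            self._data[key] = value

Since your key is unhashable anyway, you could store a digest of a serialized form instead, e.g. hashlib.sha1(pickle.dumps(key)).digest(), assuming the key can be pickled; that would also shrink each stored key from 1 KB to 20 bytes.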