I'm facing the following dilemma while implementing an LRU cache. For every entry I have to store (key, value, timestamp), so as you'd guess the cache structure can be:
cache[key] = [ value, tstamp ]
So far so good. The problem is that when the cache is full and I have to purge an element, the LRU algorithm requires a linear search over the timestamps to find the least recently used item, which is slow.
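To make the problem concrete, here is a minimal sketch of the structure I described. It's illustrative only; I use a monotonically increasing counter instead of `time.time()` as the timestamp to avoid ties, but the eviction is the same O(n) scan:

```python
class NaiveLRU(object):
    """LRU cache as described: cache[key] = [value, tstamp].

    Eviction does an O(n) linear scan over all timestamps,
    which is the slow part I'm trying to avoid.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}   # key -> [value, tstamp]
        self.tick = 0     # monotonic counter standing in for time.time()

    def get(self, key):
        value, _ = self.cache[key]
        self.tick += 1
        self.cache[key][1] = self.tick  # mark as recently used
        return value

    def put(self, key, value):
        if key not in self.cache and len(self.cache) >= self.capacity:
            # the slow part: linear search for the smallest timestamp
            oldest = min(self.cache, key=lambda k: self.cache[k][1])
            del self.cache[oldest]
        self.tick += 1
        self.cache[key] = [value, self.tick]
```

With a capacity of 2, after `put('a', 1)`, `put('b', 2)`, `get('a')`, a `put('c', 3)` scans all entries and evicts `'b'` as least recently used.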
On the other hand, I can't build a reverse hash tstamp => key, because each key is about 1 KB and the additional memory usage would negate the benefit of the cache.
Any ideas on how to preserve the speed without increasing memory usage?
If I could have something like a reference, I could probably point at the cache key without allocating additional memory!
I can't use an LRU library, because the key is unhashable by default; I have to transform it beforehand. I'm also on Python 2.7.
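For context, the transform I mean looks roughly like this (an illustrative sketch only; the function name is mine, and it assumes the raw key is JSON-serialisable):

```python
import hashlib
import json

def hashable_key(raw_key):
    """Turn an unhashable key (e.g. a dict or a list) into a stable
    hashable one by serialising it deterministically and hashing.

    Sketch only: assumes raw_key is JSON-serialisable; sort_keys=True
    makes the digest independent of dict insertion order.
    """
    blob = json.dumps(raw_key, sort_keys=True).encode('utf-8')
    return hashlib.sha1(blob).hexdigest()
```

A side effect is that the 40-character hex digest is far smaller than the original ~1 KB key, though of course it only works if I never need to recover the original key from the cache.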